US20170253177A1 - Vehicle lighting system - Google Patents
- Publication number
- US20170253177A1 (application US15/444,988)
- Authority
- US
- United States
- Prior art keywords
- pattern
- vehicle
- information
- projection
- target
- Prior art date
- Legal status
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/50—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
- B60Q1/525—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking automatically indicating risk of collision between vehicles in traffic or with pedestrians, e.g. after risk assessment using the vehicle sensor data
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
- B60K35/28
- B60K35/60
- B60K35/85
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/04—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
- B60Q1/06—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/24—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for lighting other areas than only the way ahead
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/24—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for lighting other areas than only the way ahead
- B60Q1/249—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for lighting other areas than only the way ahead for illuminating the field of view of a sensor or camera
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/50—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- B60K2360/175
- B60K2360/178
- B60K2360/179
- B60K2360/334
- B60K2360/5915
- B60K2360/797
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2300/00—Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
- B60Q2300/40—Indexing codes relating to other road users or special conditions
- B60Q2300/45—Special conditions, e.g. pedestrians, road signs or potential dangers
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2400/00—Special features or arrangements of exterior signal lamps for vehicles
- B60Q2400/50—Projected symbol or information, e.g. onto the road or car body
Definitions
- the present disclosure relates to the technical field of a vehicle lighting system that presents information to a target by projecting light around a host vehicle.
- JP 2008-143510 A proposes a technology that detects a target around a vehicle and alerts the target by projecting light in a predetermined pattern toward that target.
- JP 2009-282564 proposes a technology that expands the projection range of the headlight further forward than usual when it is determined that a vehicle is likely to collide with a pedestrian.
- JP 2008-143510 A (Japanese Patent Application Publication No. 2008-143510 A)
- the present disclosure provides a vehicle lighting system that can present information appropriately to a target even when there is a plurality of vehicles capable of presenting information to the target by a projection light.
- a vehicle lighting system in an aspect of the present disclosure includes a first detection unit configured to detect a moving body around a host vehicle, a first determination unit configured to determine whether the moving body is a target to which an alert should be sent, a projection unit capable of projecting a first pattern, which indicates first information on the host vehicle, in a predetermined range around the target when the first determination unit determines that the moving body is the target, a second determination unit configured to determine whether another vehicle is projecting a second pattern, which indicates second information on the other vehicle, in the predetermined range around the target, and a control unit configured to control the projection unit such that, when the second determination unit determines that the other vehicle is projecting the second pattern, the first pattern is either projected at a position that is in the predetermined range around the target and does not overlap with the second pattern, or is not projected.
- with the vehicle lighting system of the present disclosure, when the other vehicle is projecting the second pattern, the first pattern projected from the host vehicle is either projected at a position that is in the predetermined range around the target and does not overlap with the second pattern, or is not projected at all. Therefore, the vehicle lighting system of the present disclosure reduces the possibility that overlapping between the first pattern, projected by the host vehicle, and the second pattern, projected by the other vehicle, prevents appropriate information from being presented to the target.
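The placement rule above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the `Pattern` rectangle model, the candidate-position list, and the axis-aligned overlap test are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Pattern:
    # Axis-aligned footprint of a projected pattern on the road surface,
    # centred at (x, y), in metres (illustrative model).
    x: float
    y: float
    width: float
    height: float

    def overlaps(self, other: "Pattern") -> bool:
        # Two rectangles overlap when the distance between their centres
        # along each axis is less than the sum of their half-extents.
        return (abs(self.x - other.x) * 2 < self.width + other.width and
                abs(self.y - other.y) * 2 < self.height + other.height)

def place_first_pattern(candidates, second_pattern):
    """Return the first candidate position (all assumed to lie in the
    predetermined range around the target) that does not overlap the other
    vehicle's second pattern, or None to suppress projection entirely."""
    for candidate in candidates:
        if second_pattern is None or not candidate.overlaps(second_pattern):
            return candidate
    return None  # no non-overlapping position: do not project
```

Returning `None` corresponds to the "first pattern is not projected" branch of the control described above.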
- the control unit may be configured to control the projection unit such that the first pattern is projected in the predetermined range around the target when the second determination unit determines that the other vehicle is not projecting the second pattern.
- the first information on the host vehicle can be presented to the target.
- the vehicle lighting system may further include a difference calculation unit configured to calculate difference information that is included in the first information but is not included in the second information.
- the projection unit can project a third pattern, which indicates the difference information, in addition to the first pattern, and the control unit may be configured to control the projection unit such that, instead of the first pattern, the third pattern is projected in the predetermined range around the target at a position where it does not overlap with the second pattern, when the second determination unit determines that the other vehicle is projecting the second pattern.
- the difference information can be presented to the target without interfering with the projection of the second pattern.
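If the items of information are modelled as a set (an assumption; the disclosure does not specify a representation), the difference calculation unit reduces to a set difference. The information items used in the example are hypothetical.

```python
def difference_information(first_info: set, second_info: set) -> set:
    """Items included in the host vehicle's first information but not in the
    other vehicle's second information (the 'difference information')."""
    return first_info - second_info

# Hypothetical example: the other vehicle already projects "vehicle
# approaching", so only the remaining item would be shown as the third pattern.
third = difference_information({"vehicle approaching", "from the left"},
                               {"vehicle approaching"})
```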
- each of the first information and the second information may have information indicating priority.
- the vehicle lighting system may further include a third determination unit that determines the priority of the first information and the second information.
- the control unit may be configured to control the projection unit such that a request is sent to the other vehicle to stop the projection of the second pattern and, after that, the first pattern is projected, when the second determination unit determines that the other vehicle is projecting the second pattern and the third determination unit determines that the priority of the first information is higher than the priority of the second information.
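The priority handling above can be summarised in a small decision function; the integer priorities and the returned action names are illustrative assumptions, not terms from the disclosure.

```python
def arbitrate(first_priority: int, second_priority: int,
              other_is_projecting: bool) -> str:
    """Decide the host vehicle's projection action (higher value = higher
    priority). Returns 'project', 'request_stop_then_project', or 'yield'."""
    if not other_is_projecting:
        return "project"
    if first_priority > second_priority:
        # Ask the other vehicle (e.g. over vehicle-vehicle communication) to
        # stop projecting the second pattern, then project the first pattern.
        return "request_stop_then_project"
    # Otherwise leave the other vehicle's projection in place.
    return "yield"
```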
- the projection unit can project the second pattern and a fourth pattern, which indicates both the first information and the second information, in addition to the first pattern, and the control unit may be configured to control the projection unit such that the second pattern or the fourth pattern is projected instead of the first pattern, when the second determination unit determines that the other vehicle is not projecting the second pattern.
- the second information on the other vehicle can be presented from the host vehicle by projecting the second pattern or the fourth pattern.
- the first determination unit may be configured to determine whether the moving body is the target, based on contact possibility between the host vehicle and the moving body or peripheral visibility of the moving body.
- the first determination unit may be configured to determine whether the moving body is the target, based on contact possibility between the other vehicle and the moving body or the peripheral visibility of the moving body.
- the second determination unit may be configured to determine whether the other vehicle is projecting the second pattern based on an analysis result of captured image data.
- FIG. 1 is a block diagram showing a configuration of a vehicle lighting system in a first embodiment
- FIG. 2 is a top view showing an example in which a host vehicle projects a projection pattern toward a pedestrian;
- FIG. 3 is a top view showing a reference example in which the host vehicle and another vehicle project projection patterns toward the same pedestrian;
- FIG. 4 is a flowchart showing the flow of the operation of the vehicle lighting system in the first embodiment
- FIG. 5 is a flowchart showing a target determination method using contact possibility
- FIG. 6 is a flowchart showing a target determination method using peripheral visibility
- FIG. 7 is a flowchart showing the flow of the operation of a vehicle lighting system in a second embodiment
- FIG. 8 is a flowchart showing the processing flow of first projection control
- FIG. 9 is a top view showing an example in which information on the host vehicle and the other vehicle is projected.
- FIG. 10 is a flowchart showing the processing flow of second projection control
- FIG. 11 is a top view showing an example in which a projection pattern is projected at a position not overlapping with the position of a projection pattern from the other vehicle.
- FIG. 12 is a top view showing an example in which the difference information indicating the difference from the projection pattern of the other vehicle is projected.
- a vehicle lighting system in the first embodiment is described with reference to FIG. 1 to FIG. 6 .
- the configuration of the vehicle lighting system, the problems that may be generated when using a projection pattern, the operation of the vehicle lighting system, and the technical effects achieved by the vehicle lighting system are described sequentially.
- FIG. 1 is a block diagram showing the configuration of the vehicle lighting system in the first embodiment.
- the vehicle lighting system 10 in the first embodiment, mounted on a vehicle such as an automobile, is configured to be able to project a predetermined projection pattern on the road surface by projecting light.
- the vehicle lighting system 10 includes an in-vehicle camera 110 , a radar 120 , a communication device 130 , a sensor group 140 , an electronic control unit (ECU) 200 , and a projection unit 300 .
- the in-vehicle camera 110 is a camera capable of capturing the area ahead of the vehicle (in other words, the area corresponding to the field of view of the driver).
- the in-vehicle camera 110 may be a camera that captures an image using visible light or a camera that captures images using non-visible light (for example, infrared light).
- An image captured by the in-vehicle camera 110 is output to an information acquisition unit 210 as image data.
- the radar 120 is a radar capable of recognizing an object that is present in the area ahead of the vehicle.
- the radar 120 is configured to be able to detect the position and the moving speed of an object around the vehicle by using millimeter waves or a laser.
- the information on an object detected by the radar 120 is output to the information acquisition unit 210 .
- the communication device 130 is configured to be able to receive information on the surrounding environment of the vehicle by radio communication. More specifically, the communication device 130 carries out vehicle-vehicle communication, road-vehicle communication, or pedestrian-vehicle communication to receive information on the other vehicles and pedestrians. The communication device 130 may also be configured to be able to send information on the host vehicle. The information received by the communication device 130 is output to the information acquisition unit 210 .
- the sensor group 140 includes a plurality of sensors capable of detecting the status of the host vehicle.
- the sensor group 140 includes a vehicle speed sensor, an acceleration sensor, a yaw sensor, and the like.
- the information indicating the status of the host vehicle, detected by the sensor group 140 , is output to the information acquisition unit 210 .
- the ECU 200 , a controller unit having an arithmetic circuit such as a central processing unit (CPU), is configured to be able to control various operations in the vehicle.
- the ECU 200 in this embodiment is configured to be able to perform control for drawing a projection pattern that will be described later.
- the ECU 200 includes the following units as logical or physical processing blocks that are realized therein: the information acquisition unit 210 , a projection target determination unit 220 , an other vehicle projection determination unit 230 , and a projection determination unit 240 .
- the information acquisition unit 210 is configured to be able to acquire the information, output from each of the in-vehicle camera 110 , the radar 120 , and the communication device 130 , as the surrounding environment information (that is, the information indicating the surrounding environment of the vehicle, in particular, the information on the objects around the vehicle).
- the surrounding environment information acquired by the information acquisition unit 210 is output to the projection target determination unit 220 .
- the information acquisition unit 210 , together with the in-vehicle camera 110 , the radar 120 , and the communication device 130 , functions as a specific example of the “detection unit (detection means)”.
- the information acquisition unit 210 is configured to be able to acquire the information, output from the sensor group 140 , as host vehicle information (that is, the information indicating the status of the host vehicle).
- the host vehicle information, acquired by the information acquisition unit 210 , is output to the projection target determination unit 220 together with the surrounding environment information.
- the projection target determination unit 220 is configured to be able to detect moving bodies, which are present around the vehicle, using the surrounding environment information received from the information acquisition unit 210 .
- the “moving body” mentioned here means not only an object that is actually moving but also an object that is likely to move. For example, pedestrians who are stopped and other vehicles that are stopped may be detected as moving bodies.
- the projection target determination unit 220 is configured to be able to determine whether the detected moving body is a target.
- the “target” mentioned here is a target to which an alert should be sent using a projection pattern. For example, a pedestrian who is crossing in front of the host vehicle on the road, on which the host vehicle is traveling, is a target.
- the information on the target determined by the projection target determination unit 220 (for example, information indicating the attribute and the position of the target, the movement direction, the movement speed, etc.) is output to the other vehicle projection determination unit 230 .
- the projection target determination unit 220 is a specific example of the “first determination unit (first determination means)”.
- the other vehicle projection determination unit 230 is configured to be able to determine whether the other vehicle is projecting a projection pattern to the target determined by the projection target determination unit 220 . More specifically, the other vehicle projection determination unit 230 determines whether the other vehicle is projecting a projection pattern, based on the analysis result of the image data captured by the in-vehicle camera 110 or based on the vehicle-vehicle communication carried out by the communication device 130 . In addition, the other vehicle projection determination unit 230 may be configured to be able to determine not only whether the other vehicle is projecting a projection pattern but also what type of information the projected projection pattern presents. The determination result of the other vehicle projection determination unit 230 is output to the projection determination unit 240 .
- the other vehicle projection determination unit 230 is a specific example of the “second determination unit (second determination means)”.
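A sketch of the second determination: the other vehicle projection determination unit 230 combines a camera-image analysis result with received vehicle-vehicle messages. The message dictionary schema used here is an assumption made purely for illustration.

```python
def other_vehicle_is_projecting(v2v_messages, pattern_seen_in_image: bool) -> bool:
    """True if image analysis detected a projection pattern near the target,
    or any received vehicle-vehicle message reports an active projection."""
    return pattern_seen_in_image or any(
        message.get("projecting", False) for message in v2v_messages
    )
```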
- the projection determination unit 240 is configured to be able to determine whether to project a projection pattern from the host vehicle, based on the determination result of the other vehicle projection determination unit 230 .
- the projection determination unit 240 may be configured to be able to determine not only whether to project a projection pattern from the host vehicle but also the specific projection mode of the projection pattern.
- the specific determination processing performed by the projection determination unit 240 is described in detail in the description of the operation that will be given later.
- the determination result of the projection determination unit 240 is output to the projection unit 300 .
- the projection determination unit 240 is a specific example of the “control unit (control means)”.
- the projection unit 300 includes a light (for example, the headlight of the vehicle) capable of changing the direction, and the pattern, of light to be projected.
- the projection unit 300 is configured to be able to project a predetermined projection pattern in a predetermined range around a target based on the determination result of the projection determination unit 240 .
- the “predetermined range” mentioned here is the range in which a projection pattern, drawn on the road surface by the light from the projection unit 300 , is clear enough to be recognized by the target. That is, the “predetermined range” is the range in which the target can visually recognize the predetermined projection pattern.
- the projection unit 300 may have a function to change the projection position to an appropriate position according to the movement of the host vehicle or the target.
- the projection unit 300 is a specific example of the “projection unit (projection means)”.
- FIG. 2 is a top view showing an example in which the host vehicle projects a projection pattern toward a pedestrian.
- FIG. 3 is a top view showing a reference example in which the host vehicle and the other vehicle project projection patterns toward the same pedestrian.
- the vehicle lighting system 10 in this embodiment projects light toward the area in front of the pedestrian 30 to draw a projection pattern 50 for alerting the pedestrian 30 .
- the projection pattern 50 is projected, for example, as a figure including the “exclamation mark” and the “arrow” as shown in the figure.
- the exclamation mark is generally recognized as a mark meaning “caution” or “danger.” For this reason, the pedestrian 30 who views the projection pattern 50 is expected to pay attention to the direction indicated by the arrow.
- Using the projection pattern 50 in this way makes it possible to appropriately alert the pedestrian 30 (i.e., target) that the host vehicle 20 is approaching.
- when the host vehicle 20 and the other vehicle 40 project projection patterns toward the same pedestrian 30 , as in the reference example shown in FIG. 3 , the overlapping projection pattern 50 a and projection pattern 50 b may form a shape that is difficult to recognize or cannot be recognized. In such a case, it becomes difficult to appropriately alert the pedestrian 30 .
- the overlapping between the projection pattern 50 a and the projection pattern 50 b may result in forming a pattern that is recognized as a pattern for presenting incorrect information to the pedestrian 30 .
- the resulting pattern may be recognized as a pattern for prompting the pedestrian to approach the projection pattern 50 . In this case, the resulting pattern may increase danger to the pedestrian 30 rather than alert the pedestrian 30 .
- the vehicle lighting system 10 in this embodiment performs the operation that is described in detail below.
- FIG. 4 is a flowchart showing the flow of the operation of the vehicle lighting system in the first embodiment.
- the information acquisition unit 210 first acquires the host vehicle information detected by the sensor group 140 (step S 101 ). Next, the information acquisition unit 210 acquires the surrounding environment information detected by the in-vehicle camera 110 , the radar 120 , and the communication device 130 (step S 102 ). The processing in step S 101 and step S 102 may be performed in reverse order or concurrently. The host vehicle information and the surrounding environment information, acquired by the information acquisition unit 210 , are output to the projection target determination unit 220 .
- the projection target determination unit 220 determines whether there is a target, to which an alert should be sent, around the host vehicle 20 (step S 103 ). From the surrounding environment information, the projection target determination unit 220 detects a moving body that is likely to be a target and, based on the host vehicle information and the surrounding environment information, determines whether the detected moving body is a target.
- the target determined in this step includes not only a moving body to which an alert should be sent in relation to the host vehicle 20 but also a moving body to which an alert should be sent in relation to the other vehicle 40 . That is, if an alert should not be sent from the host vehicle 20 , but should be sent from the other vehicle 40 , to a moving body, the moving body is determined as a target.
- Whether or not the moving body is a target is determined using contact possibility or peripheral visibility calculated, for example, from the surrounding environment information.
- Contact possibility is a value indicating the possibility that the host vehicle 20 will contact the moving body or the possibility that the other vehicle 40 will contact the moving body.
- the contact possibility is calculated based on the distance or the relative speed between the host vehicle 20 or the other vehicle 40 and the moving body.
- Peripheral visibility is a value indicating the visibility around the moving body as viewed from the driver of the host vehicle 20 or the other vehicle 40 or the visibility in the direction of the host vehicle 20 or the other vehicle 40 as viewed from the moving body.
- the peripheral visibility is calculated based on the ambient brightness and whether or not there is an obstacle that reduces the visibility.
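The two scores can be computed, for example, as follows. The disclosure only states the inputs (distance and relative speed; brightness and obstacles); the specific formulas below, an inverse time-to-contact and a clamped brightness score, are assumptions for the sake of the sketch.

```python
def contact_possibility(distance_m: float, closing_speed_mps: float) -> float:
    """Illustrative contact-possibility score: the inverse of the time to
    contact, or 0.0 when the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return 0.0
    return closing_speed_mps / distance_m  # 1 / (seconds until contact)

def peripheral_visibility(ambient_brightness: float,
                          obstacle_present: bool) -> float:
    """Illustrative visibility score in [0, 1]: ambient brightness clamped
    to [0, 1], halved when an obstacle reduces the line of sight."""
    visibility = max(0.0, min(1.0, ambient_brightness))
    return visibility * (0.5 if obstacle_present else 1.0)
```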
- FIG. 5 is a flowchart showing a target determination method using contact possibility.
- FIG. 6 is a flowchart showing a target determination method using peripheral visibility. The processing shown in FIG. 5 and FIG. 6 is the processing performed in step S 103 shown in FIG. 4 .
- a target is determined using contact possibility as shown in FIG. 5 .
- it is determined whether there is a possibility that the moving body will contact the host vehicle 20 (step S 103 a ). More specifically, the contact possibility between the moving body and the host vehicle 20 is calculated based on the distance and the relative speed between the moving body and the host vehicle 20 . Then, the calculated contact possibility is compared with the predetermined first threshold to determine whether there is a possibility that the moving body and the host vehicle 20 will contact with each other.
- when it is determined that there is a possibility that the moving body and the host vehicle 20 will contact (step S 103 a : YES), it is determined that the moving body is a target and the processing proceeds to step S 104 .
- when it is determined that there is no possibility that the moving body and the host vehicle 20 will contact (step S 103 a : NO), it is determined in step S 103 b whether there is a possibility that the moving body will contact the other vehicle 40 . More specifically, the contact possibility between the moving body and the other vehicle 40 is calculated based on the distance and the relative speed between the moving body and the other vehicle 40 . Then, the calculated contact possibility is compared with the predetermined second threshold to determine whether there is a possibility that the moving body and the other vehicle 40 will contact with each other.
- when it is determined that there is a possibility that the moving body and the other vehicle 40 will contact (step S 103 b : YES), it is determined that the moving body is a target and the processing proceeds to step S 104 . On the other hand, when it is determined that there is no possibility that the moving body and the other vehicle 40 will contact (step S 103 b : NO), it is determined that the moving body is not a target and the processing proceeds to step S 105 .
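The FIG. 5 flow reduces to two threshold comparisons; the score values, threshold values, and function name below are placeholders for illustration.

```python
def is_target_by_contact(possibility_host: float, possibility_other: float,
                         first_threshold: float, second_threshold: float) -> bool:
    """Steps S103a/S103b: the moving body is a target if it may contact the
    host vehicle, or, failing that, if it may contact the other vehicle."""
    if possibility_host > first_threshold:       # step S103a: YES
        return True
    return possibility_other > second_threshold  # step S103b
```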
- when peripheral visibility is used, it is first determined in step S 103 c , as shown in FIG. 6 , whether the visibility of the moving body as viewed from the host vehicle 20 is poor. More specifically, the peripheral visibility around the moving body as viewed from the host vehicle 20 is calculated based on the brightness, and on whether or not there is an obstacle, around the moving body. After that, the calculated peripheral visibility is compared with the predetermined third threshold to determine whether the peripheral visibility of the moving body as viewed from the host vehicle 20 is poor. When it is determined that the peripheral visibility of the moving body as viewed from the host vehicle 20 is poor (step S 103 c : YES), it is determined that the moving body is a target and the processing proceeds to step S 104 .
- step S 103 d it is determined whether the visibility of the host vehicle 20 as viewed from the moving body is poor. More specifically, the peripheral visibility around the host vehicle 20 as viewed from the moving body is calculated based on the brightness, and on whether or not there is an obstacle, around the host vehicle 20 . After that, the calculated peripheral visibility is compared with the predetermined fourth threshold to determine whether the visibility of the host vehicle 20 as viewed from the moving body is poor.
- step S 103 d When it is determined that the visibility of the host vehicle 20 as viewed from the moving body is poor (step S 103 d : YES), then it is determined that the moving body is a target and the processing proceeds to step S 104 . On the other hand, when it is determined that the visibility of the host vehicle 20 as viewed from the moving body is not poor (step S 103 d : NO), then it is determined that the moving body is not a target and the processing proceeds to step S 105 .
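The visibility-based determination (steps S 103 c and S 103 d ) can be sketched the same way. How brightness and obstacle presence are combined into a score is not specified in the text, so the formula and thresholds below are illustrative assumptions.

```python
def peripheral_visibility(brightness: float, obstacle_present: bool) -> float:
    """Hypothetical visibility score in [0, 1], computed from ambient
    brightness (0 = dark, 1 = full daylight) and the presence of an
    occluding obstacle, as steps S103c/S103d describe."""
    return brightness * (0.3 if obstacle_present else 1.0)

def visibility_makes_target(brightness_at_body, obstacle_near_body,
                            brightness_at_host, obstacle_near_host,
                            third_threshold=0.4, fourth_threshold=0.4):
    """The moving body is a target when it is hard to see from the host
    vehicle (S103c) or the host vehicle is hard to see from it (S103d)."""
    if peripheral_visibility(brightness_at_body, obstacle_near_body) < third_threshold:
        return True  # step S103c: YES
    return peripheral_visibility(brightness_at_host, obstacle_near_host) < fourth_threshold

print(visibility_makes_target(0.9, True, 0.9, False))   # occluded body -> True
print(visibility_makes_target(0.9, False, 0.9, False))  # clear daylight -> False
```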
- the peripheral visibility of the moving body as viewed from the host vehicle 20 and the peripheral visibility of the host vehicle 20 as viewed from the moving body may be determined.
- the peripheral visibility as well as the driving operation of the driver of the host vehicle 20 may be detected to determine whether the driver of the host vehicle 20 has recognized that there is the moving body.
- the peripheral visibility as well as the movement of the moving body may be detected to determine whether the moving body has recognized that there is the host vehicle 20 .
- In step S 103 , when the projection target determination unit 220 determines that the moving body is not a target (that is, that an alert should not be sent) (step S 103 : NO), the host vehicle 20 does not project a projection light (step S 105 ) and the series of processing is terminated. That is, the host vehicle 20 does not send an alert using the projection pattern 50 .
- On the other hand, when it is determined that the moving body is a target (step S 103 : YES), the other vehicle projection determination unit 230 determines whether the other vehicle 40 is already projecting the projection pattern 50 toward the target (step S 104 ).
- the projection pattern 50 projected by the other vehicle 40 is a specific example of the “second pattern”.
- When it is determined that the other vehicle 40 is already projecting the projection pattern 50 toward the target (step S 104 : YES), the projection determination unit 240 determines that the projection pattern 50 should not be projected from the host vehicle 20 toward the target. Therefore, if the other vehicle 40 is already projecting the projection pattern 50 , the host vehicle 20 does not project the projection pattern 50 (step S 105 ).
- On the other hand, when it is determined that the other vehicle 40 is not projecting the projection pattern 50 toward the target (step S 104 : NO), the projection determination unit 240 determines that the projection pattern 50 should be projected from the host vehicle 20 toward the target. Therefore, if the other vehicle 40 is not projecting the projection pattern 50 , the projection pattern 50 is projected from the host vehicle 20 toward the target (step S 106 ).
- the projection pattern 50 projected by the host vehicle 20 is a specific example of the “first pattern”.
- As described above, the vehicle lighting system 10 in the first embodiment determines whether to project the projection pattern 50 from the host vehicle 20 depending upon whether or not the projection pattern 50 is being projected from the other vehicle 40 . When the other vehicle 40 is already projecting the projection pattern 50 , the projection pattern 50 is not projected from the host vehicle 20 , and an alert is sent only by the projection pattern 50 b projected by the other vehicle 40 .
- Projecting the projection pattern 50 in this way avoids a situation in which overlapping between the projection pattern 50 a , projected from the host vehicle 20 , and the projection pattern 50 b , projected from the other vehicle 40 , makes it impossible to send an appropriate alert to the target.
- Conversely, when the other vehicle 40 is not projecting the projection pattern 50 , the projection pattern 50 is projected from the host vehicle 20 , and an alert is sent only by the projection pattern 50 a projected by the host vehicle 20 .
- Projecting the projection pattern 50 in this way also avoids a situation in which overlapping between the projection pattern 50 a , projected from the host vehicle 20 , and the projection pattern 50 b , projected from the other vehicle 40 , makes it impossible to send an appropriate alert to the target.
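Taken together, steps S 103 to S 106 of the first embodiment reduce to a short dispatch. A minimal sketch, with return labels that are illustrative rather than taken from the text:

```python
def first_embodiment_decision(is_target: bool, other_projecting: bool) -> str:
    """Top-level flow of FIG. 4: the host projects its pattern only when
    there is a target and the other vehicle 40 is not already projecting."""
    if not is_target:
        return "no projection"   # step S103: NO -> step S105
    if other_projecting:
        return "no projection"   # step S104: YES -> step S105
    return "project pattern"     # step S104: NO -> step S106

print(first_embodiment_decision(True, False))  # project pattern
print(first_embodiment_decision(True, True))   # no projection
```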
- The second embodiment has a configuration almost identical to that of the first embodiment described above, and differs only in a part of the operation. Therefore, in the description below, only the parts different from the first embodiment are described in detail, and duplicate description of the overlapping parts is omitted as appropriate.
- FIG. 7 is a flowchart showing the flow of the operation of the vehicle lighting system in the second embodiment.
- the same reference numeral is given to the same processing as that of the first embodiment described above (see FIG. 4 ), and the description thereof is omitted as appropriate.
- the information acquisition unit 210 acquires the host vehicle information (step S 101 ).
- the information acquisition unit 210 acquires the surrounding environment information (step S 102 ).
- the projection target determination unit 220 determines whether there is a target, to which an alert should be sent, around the host vehicle 20 (step S 103 ).
- When it is determined that there is no target (step S 103 : NO), the host vehicle 20 does not project a projection light (step S 209 ), and the series of processing is terminated. That is, the host vehicle 20 does not send an alert using the projection pattern 50 .
- When it is determined that there is a target (step S 103 : YES), the other vehicle projection determination unit 230 determines whether the other vehicle 40 is already projecting the projection pattern toward the target (step S 204 ). When it is determined that the other vehicle 40 is not projecting the projection pattern (step S 204 : NO), the projection determination unit 240 performs the first projection control (step S 210 ).
- the first projection control is a control operation performed when the other vehicle 40 is not projecting the projection pattern 50 (in other words, a control operation performed when only the host vehicle 20 projects the projection pattern 50 ).
- FIG. 8 is a flowchart showing the processing flow of the first projection control.
- When the first projection control is started, it is first determined whether the information (presentation information) to be presented to the target using the projection pattern relates to the other vehicle 40 (step S 301 ).
- When it is determined that the presentation information does not relate to the other vehicle 40 (step S 301 : NO), the projection pattern 50 indicating the information on the host vehicle 20 is projected from the host vehicle 20 toward the target (step S 302 ).
- the projection pattern 50 projected from the host vehicle 20 in this case is a specific example of the “first pattern”.
- On the other hand, when it is determined that the presentation information relates to the other vehicle 40 (step S 301 : YES), it is then determined whether the presentation information relates to the host vehicle 20 as well (step S 303 ). More specifically, it is determined whether the presentation information relates to both the host vehicle 20 and the other vehicle 40 or only to the other vehicle 40 .
- When it is determined that the presentation information relates to the host vehicle 20 as well (step S 303 : YES), the projection pattern 50 , which indicates the information on the host vehicle 20 and the other vehicle 40 , is projected from the host vehicle 20 toward the target (step S 304 ). That is, the host vehicle 20 presents not only the information on the host vehicle 20 but also the information on the other vehicle 40 to the target.
- the projection pattern 50 projected from the host vehicle 20 in this case is a specific example of the “fourth pattern”.
- FIG. 9 is a top view showing a projection example in which the information on the host vehicle and the other vehicle is projected.
- As shown in FIG. 9 , the projection pattern 50 , which indicates the information on the host vehicle 20 and the other vehicle 40 , includes the two “arrows” in addition to the “exclamation mark”. These two arrows indicate the direction from which the host vehicle 20 is approaching and the direction from which the other vehicle 40 is approaching, respectively. Therefore, the pedestrian 30 who sees such a projection pattern 50 is expected to pay attention to the directions indicated by the two arrows (that is, both the left and the right directions as seen from the pedestrian 30 ).
- On the other hand, when it is determined that the presentation information relates only to the other vehicle 40 (step S 303 : NO), the projection pattern 50 that indicates the information on the other vehicle 40 is projected from the host vehicle 20 toward the target (step S 305 ). That is, instead of the information on the host vehicle, the host vehicle 20 presents the information on the other vehicle 40 to the target.
- the projection pattern 50 projected from the host vehicle 20 in this case is a specific example of the “second pattern”.
- In step S 304 and step S 305 described above, the host vehicle 20 presents the information on the other vehicle 40 even when the other vehicle 40 does not present the information on itself. Therefore, an appropriate alert can be sent to the target, for example, even when the other vehicle 40 does not have a system similar to the vehicle lighting system 10 .
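The branch structure of steps S 301 to S 305 can be condensed into a small dispatcher. A sketch under the assumption that the presentation information has already been classified into boolean flags; the returned names follow the pattern terms used in the text:

```python
def first_projection_control(relates_to_other: bool, relates_to_host: bool) -> str:
    """FIG. 8 decision: which pattern the host projects when no other
    vehicle is projecting toward the target."""
    if not relates_to_other:
        return "first pattern"   # step S302: host-vehicle information only
    if relates_to_host:
        return "fourth pattern"  # step S304: information on both vehicles
    return "second pattern"      # step S305: other-vehicle information only

print(first_projection_control(False, True))   # first pattern
print(first_projection_control(True, True))    # fourth pattern
print(first_projection_control(True, False))   # second pattern
```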
- the projection pattern 50 projected during the first projection control is projected when the other vehicle 40 is not projecting the projection pattern 50 . That is, only one projection pattern is projected. For this reason, the projection pattern 50 projected from the host vehicle 20 during the first projection control only needs to be projected at a position where the target can visually recognize the pattern. There is no need to consider the positional relation with another projection pattern as in the case of the second projection control that will be described later.
- On the other hand, when it is determined that the other vehicle 40 is already projecting the projection pattern (step S 204 : YES), the priority of the information to be presented by the host vehicle 20 and the priority of the information to be presented by the other vehicle 40 are compared (step S 205 ).
- The “priority” mentioned here is the parameter for determining to which information, the information to be presented by the host vehicle 20 or the information to be presented by the other vehicle 40 , priority is to be given. For example, this parameter is set in advance according to an importance that is determined according to the type of information (more specifically, according to which information is linked more directly to danger).
- The unit that determines the priority of the information to be presented by the host vehicle 20 and the priority of the information to be presented by the other vehicle 40 is a specific example of the “third determination unit (third determination means)”. Further, the ECU 200 may include this unit as a logical or physical processing block.
- the priority may be the parameter that varies depending not only upon the type of information but also upon the situation. More specifically, for the information to be presented to the pedestrian 30 closer to the other vehicle 40 rather than to the host vehicle 20 , it is only required that the priority of the information to be presented from the other vehicle 40 be set higher than the priority of the information to be presented from the host vehicle 20 . Conversely, for the information to be presented to the pedestrian 30 closer to the host vehicle 20 rather than to the other vehicle 40 , it is only required that the priority of the information to be presented from the host vehicle 20 be set higher than the priority of the information to be presented from the other vehicle 40 .
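The situation-dependent priority described above can be sketched as a score that combines a type-dependent importance with a distance-based boost. The formula is a hypothetical example; the text only requires that the vehicle closer to the pedestrian 30 end up with the higher priority when the information types are comparable.

```python
def information_priority(importance: float, dist_to_target_m: float) -> float:
    """Hypothetical priority: type-dependent importance plus a boost for
    the vehicle closer to the pedestrian, per the situation-dependent
    variant described above."""
    return importance + 1.0 / (1.0 + dist_to_target_m)

def host_has_priority(host_importance, host_dist, other_importance, other_dist):
    """Step S205: True when the host vehicle's information should win."""
    return (information_priority(host_importance, host_dist)
            > information_priority(other_importance, other_dist))

# With equally important information, the vehicle nearer the pedestrian wins.
print(host_has_priority(1.0, 5.0, 1.0, 20.0))   # True
print(host_has_priority(1.0, 20.0, 1.0, 5.0))   # False
```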
- When it is determined that the priority of the information to be presented by the host vehicle 20 is higher, a projection stop request is output from the host vehicle 20 to the other vehicle 40 (step S 207 ). More specifically, a request for stopping the projection of the projection pattern 50 is output to the other vehicle 40 using vehicle-vehicle communication carried out by the communication device 130 .
- After outputting the projection stop request, it is determined whether the other vehicle 40 has actually stopped the projection of the projection pattern 50 (step S 208 ). This determination is made by carrying out vehicle-vehicle communication using the communication device 130 or by analyzing the image data captured by the in-vehicle camera 110 .
- When it is determined that the other vehicle 40 has stopped the projection (step S 208 : YES), the first projection control described above is performed (step S 210 ). That is, the projection control is performed on the premise that the other vehicle 40 is not projecting the projection pattern.
- On the other hand, when it is determined that the other vehicle 40 has not stopped the projection (step S 208 : NO), the projection stop request is output to the other vehicle 40 repeatedly (step S 207 ).
- However, the output of the projection stop request may be stopped after the request has been output a predetermined number of times or more.
- In that case, the second projection control (step S 211 ), which will be described below, may be performed as an exceptional operation.
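The stop-request loop of steps S 207 and S 208 , capped at a predetermined number of attempts, might look like the sketch below. The cap value and the two injected callables (standing in for the communication device 130 and the camera/vehicle-vehicle confirmation check) are assumptions for illustration.

```python
MAX_STOP_REQUESTS = 3  # assumed cap; the text only says "a predetermined number"

def request_projection_stop(send_request, other_stopped) -> bool:
    """Steps S207/S208 as a loop: repeat the vehicle-vehicle stop request
    until the other vehicle 40 stops projecting, giving up after
    MAX_STOP_REQUESTS attempts.

    Returns True to proceed to first projection control (S210), or False
    to fall back to second projection control (S211) as the exception."""
    for _ in range(MAX_STOP_REQUESTS):
        send_request()          # step S207: output the stop request
        if other_stopped():     # step S208: has projection actually stopped?
            return True
    return False

# Simulated other vehicle that complies on the second request.
state = {"requests": 0}
print(request_projection_stop(
    lambda: state.update(requests=state["requests"] + 1),
    lambda: state["requests"] >= 2))
print(state["requests"])
```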
- When it is determined that the priority of the information to be presented by the host vehicle 20 is not higher than that of the other vehicle 40 , the projection determination unit 240 performs the second projection control (step S 211 ).
- the second projection control is a control operation that is performed when the other vehicle 40 is already projecting the projection pattern 50 (in other words, a control operation performed when both the host vehicle 20 and the other vehicle 40 project the projection pattern 50 ).
- FIG. 10 is a flowchart showing the processing flow of the second projection control.
- When the second projection control is started, it is determined whether the other vehicle 40 is presenting the information on the host vehicle 20 (that is, whether the information indicated by the projection pattern 50 , projected by the other vehicle 40 , includes the information on the host vehicle 20 ) (step S 401 ).
- When it is determined that the other vehicle 40 is presenting the information on the host vehicle 20 (step S 401 : YES), the host vehicle 20 does not project the projection pattern 50 toward the target (step S 402 ). This is because there is no need to newly present the information on the host vehicle 20 , since it is already included in the projection pattern 50 projected from the other vehicle 40 . However, when the information on the host vehicle 20 is insufficient, the host vehicle 20 may project a projection pattern 50 that compensates for the insufficient information.
- On the other hand, when it is determined that the other vehicle 40 is not presenting the information on the host vehicle 20 (step S 401 : NO), it is then determined whether a new projection pattern 50 can be projected in an area other than the area in which the other vehicle 40 is currently projecting the projection pattern 50 (step S 403 ). More specifically, based on the surrounding environment information acquired by the information acquisition unit 210 , it is determined whether there is an area, large enough to project the new projection pattern 50 , within the visible range of the target.
- When it is determined that the new projection pattern 50 can be projected in another area (step S 403 : YES), the host vehicle 20 projects the projection pattern 50 at a non-overlapping position where it does not overlap with the projection pattern 50 already projected by the other vehicle 40 (step S 404 ).
- the “non-overlapping position” mentioned here means not only a position where the projection pattern 50 , projected by the other vehicle 40 , and the projection pattern 50 , projected by the host vehicle 20 , do not overlap at all but also a position where the overlapping portion, if any, is so small that there is little or no loss in what is meant by the projection pattern 50 .
- Note that the “non-overlapping position” is assumed to be within a predetermined range around the target and in an area visible from the target.
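One simple way to realize step S 404 is to test candidate positions against the already-projected pattern and accept the first one whose overlap is negligible. The rectangle model, the candidate list, and the 5% overlap tolerance below are all assumptions for illustration; candidates are assumed to be pre-filtered to the range visible from the target.

```python
from typing import Optional, Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height) on the road plane

def overlap_area(a: Rect, b: Rect) -> float:
    """Overlap area of two axis-aligned rectangles."""
    w = min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0])
    h = min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

def place_pattern(candidates, existing: Rect, size,
                  max_overlap_ratio=0.05) -> Optional[Rect]:
    """Step S404 sketch: pick the first candidate position where the new
    pattern overlaps the existing one little enough that neither pattern
    loses its meaning."""
    w, h = size
    for x, y in candidates:
        rect = (x, y, w, h)
        if overlap_area(rect, existing) / (w * h) <= max_overlap_ratio:
            return rect
    return None  # corresponds to step S403: NO - no usable area

existing = (0.0, 0.0, 2.0, 1.0)  # pattern 50b already on the road
print(place_pattern([(1.0, 0.0), (2.5, 0.0)], existing, (2.0, 1.0)))
```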
- FIG. 11 is a top view showing an example in which the projection pattern is projected at a position not overlapping with the position of the projection pattern from the other vehicle.
- the other vehicle 40 is projecting the projection pattern 50 b (more specifically, the pattern including the “exclamation mark” and the “right arrow (viewed from the pedestrian)”) to inform the pedestrian 30 that the other vehicle 40 is approaching.
- The host vehicle 20 projects the projection pattern 50 a (more specifically, the pattern including the “exclamation mark” and the “left arrow (viewed from the pedestrian)”) next to the projection pattern 50 b , projected by the other vehicle, to inform the pedestrian 30 that the host vehicle 20 is approaching.
- the pedestrian 30 who views the projection pattern 50 a and the projection pattern 50 b is expected to pay attention to both the right direction and the left direction.
- Note that the above-described projection mode is exemplary only, and the position of the projection pattern 50 a , projected by the host vehicle 20 , need not necessarily be next to the projection pattern 50 b projected by the other vehicle 40 . It is only required that the projection pattern 50 a be at a position where it is visible to the pedestrian 30 and does not overlap with the projection pattern 50 b projected by the other vehicle 40 . However, to give an appropriate alert to the pedestrian 30 , it is preferable that the projection pattern 50 a projected by the host vehicle 20 be positioned near the projection pattern 50 b projected by the other vehicle 40 .
- When it is determined that the projection pattern 50 cannot be projected in another area (step S 403 : NO), it is then determined whether the difference information can be projected additionally in the surrounding area of the projection pattern 50 projected by the other vehicle 40 (step S 405 ). When it is determined that the difference information can be projected additionally (step S 405 : YES), the host vehicle 20 projects the projection pattern 50 that indicates the difference information (step S 406 ).
- the “difference information” mentioned here refers to the difference between the information to be presented to the target by the host vehicle 20 and the information indicated by the projection pattern 50 projected by the other vehicle 40 .
- the difference information is calculated by the projection determination unit 240 . More specifically, the difference information can be calculated by removing the overlapping information, which overlaps with the information indicated by the projection pattern 50 projected by the other vehicle 40 , from the information to be presented from the host vehicle 20 .
- the information indicated by the projection pattern 50 projected by the other vehicle 40 may be acquired by vehicle-vehicle communication carried out by the communication device 130 or by analyzing the data captured by the in-vehicle camera 110 .
- the amount of the difference information is smaller than the amount of the information to be presented to the target by the host vehicle 20 . Therefore, the projection pattern 50 indicating the difference information can be simplified and reduced in size as compared with the independent projection pattern 50 (for example, see FIG. 11 ). This means that, even if the independent projection pattern 50 cannot be projected separately due to a shortage in the projection area, there is a possibility that the projection pattern 50 showing the difference information can be projected.
- In step S 405 described above, therefore, it is determined whether the area in which the projection pattern will be projected, while not large enough for the independent projection pattern 50 , is large enough for the projection pattern 50 indicating the difference information.
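The difference information described above is naturally modeled as a set difference: remove from the host's presentation information whatever the other vehicle's pattern already shows. The element labels below are illustrative tokens matching the FIG. 12 example.

```python
def difference_information(host_info: set, other_shown: set) -> set:
    """Remove from the host's presentation information whatever the other
    vehicle's pattern 50b already shows (the 'difference information')."""
    return host_info - other_shown

# The FIG. 12 situation: the exclamation mark is already on the road, so
# only the host's direction arrow remains to be projected.
host_info = {"exclamation mark", "left arrow"}
other_shown = {"exclamation mark", "right arrow"}
print(sorted(difference_information(host_info, other_shown)))  # ['left arrow']
```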
- FIG. 12 is a top view showing an example of projection in which the difference information indicating the difference from the projection pattern of the other vehicle is projected.
- the other vehicle 40 is projecting the projection pattern 50 b (more specifically, the pattern including the “exclamation mark” and the “right arrow (viewed from pedestrian)”) to inform the pedestrian 30 that the other vehicle 40 is approaching.
- the host vehicle 20 projects the projection pattern 50 a (more specifically, the pattern including the “left arrow (viewed from pedestrian)”) in the area immediately above the projection pattern 50 b , projected by the other vehicle, to inform the pedestrian 30 that the host vehicle 20 is approaching.
- the pedestrian 30 who views the projection pattern 50 a and the projection pattern 50 b is expected to pay attention to both the right direction and the left direction.
- the projection pattern 50 a mentioned here is a specific example of the “third pattern”.
- the information to be originally presented from the host vehicle 20 to the pedestrian 30 is the information indicating an alert to the pedestrian 30 (corresponding to the exclamation mark) and the information indicating the direction from which the host vehicle 20 is approaching (corresponding to the arrow) (for example, see FIG. 11 ).
- the information indicating an alert to the pedestrian 30 is already included in the projection pattern 50 b projected from the other vehicle 40 and there is no need to project the same information redundantly. Therefore, even when only the projection pattern 50 a , which includes only the arrow indicating the direction from which the host vehicle 20 is approaching, is projected, it is possible to present the information equivalent to the information indicated by the independent projection pattern 50 that is projected separately.
- When it is determined that the difference information cannot be projected additionally (step S 405 : NO), the host vehicle 20 does not project the projection pattern 50 (step S 407 ). Doing so prevents the information indicated by the projection pattern 50 b , projected by the other vehicle 40 , from being blocked by the projection pattern 50 a that would be projected by the host vehicle 20 .
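The whole of FIG. 10 (steps S 401 to S 407 ) can be condensed into a short decision chain. A sketch under the assumption that the three determinations have already been reduced to boolean flags; the returned labels are illustrative.

```python
def second_projection_control(other_shows_host_info: bool,
                              free_area_available: bool,
                              difference_fits: bool) -> str:
    """FIG. 10 as a decision chain: what the host projects while the
    other vehicle's pattern 50b is already on the road."""
    if other_shows_host_info:
        return "no projection"                  # step S402: already shown
    if free_area_available:
        return "full pattern, non-overlapping"  # step S404
    if difference_fits:
        return "difference pattern"             # step S406 (the third pattern)
    return "no projection"                      # step S407: avoid garbling 50b

for case in [(True, False, False), (False, True, False),
             (False, False, True), (False, False, False)]:
    print(second_projection_control(*case))
```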
- As described above, the vehicle lighting system 10 in the second embodiment determines whether, and in what manner, the host vehicle 20 projects the projection pattern 50 , according to various conditions.
- More specifically, the contents of the information presented by the projection pattern 50 , projected by the host vehicle 20 , and the projection position of the projection pattern 50 are changed to the appropriate contents and the appropriate projection position, respectively.
- an alert can be given more appropriately in this embodiment than in the first embodiment in which the host vehicle 20 does not project the projection pattern 50 when the other vehicle 40 is already projecting the projection pattern 50 .
- Note that the target to which an alert should be sent is not limited to the pedestrian 30 ; a third vehicle may also be the target.
Abstract
A vehicle lighting system includes a first detection unit that detects a moving body around a host vehicle, a first determination unit that determines whether the moving body is a target to which an alert should be sent, a projection unit that can project a first pattern, which indicates first information on the host vehicle, in a predetermined range around the target, a second determination unit that determines whether another vehicle is projecting a second pattern, which indicates second information on the other vehicle, and a control unit that, when the other vehicle is projecting the second pattern, controls the projection unit such that the first pattern is projected in the predetermined range around the target without overlapping with the second pattern, or such that the first pattern is not projected.
Description
- The disclosure of Japanese Patent Application No. 2016-043543 filed on Mar. 7, 2016 including the specification, drawings and abstract is incorporated herein by reference in its entirety.
- 1. Technical Field
- The present disclosure relates to the technical field of a vehicle lighting system that presents information to a target by projecting light around a host vehicle.
- 2. Description of Related Art
- As a system of this type, a technology is known that presents some information (for example, the information indicating that the host vehicle is approaching) to a target around the host vehicle by projecting light toward the target. Japanese Patent Application Publication No. 2008-143510 (JP 2008-143510 A) proposes a technology that detects a target around a vehicle and alerts the target by projecting the projection light of a predetermined pattern toward that target. Japanese Patent Application Publication No. 2009-282564 (JP 2009-282564 A) proposes a technology that expands the projection range of the headlight further forward than usual when it is determined that a vehicle is likely to collide with a pedestrian.
- According to the technology proposed in Japanese Patent Application Publication No. 2008-143510 (JP 2008-143510 A) described above, there is a possibility that, when there is a plurality of vehicles equipped with a similar system, a plurality of projection lights is projected toward the same target. More specifically, there may be a situation in which the host vehicle projects an alert pattern to a target and, at the same time, another vehicle projects an alert pattern to that same target.
- In such a case, the projection positions of the projection lights may overlap with each other; when they do, the shape of the pattern is lost and, as a result, there is a possibility that the target cannot recognize the pattern indicated by the projection light, or recognizes it as another pattern indicating incorrect information. Therefore, this technology has a technical problem in that an appropriate alert cannot be sent to a target due to a projection light from another vehicle.
- The present disclosure provides a vehicle lighting system that can present information appropriately to a target even when there is a plurality of vehicles capable of presenting information to the target by a projection light.
- A vehicle lighting system in an aspect of the present disclosure includes a first detection unit configured to detect a moving body around a host vehicle, a first determination unit configured to determine whether the moving body is a target to which an alert should be sent, a projection unit capable of projecting a first pattern, which indicates first information on the host vehicle, in a predetermined range around the target when the first determination unit determines that the moving body is the target, a second determination unit configured to determine whether another vehicle is projecting a second pattern, which indicates second information on the other vehicle, in the predetermined range around the target, and a control unit configured to control the projection unit, when the second determination unit determines that the other vehicle is projecting the second pattern, such that the first pattern is projected at a position that is within the predetermined range around the target and at which the first pattern does not overlap with the second pattern, or such that the first pattern is not projected.
- According to the vehicle lighting system of the present disclosure, when the other vehicle is projecting the second pattern, the first pattern projected from the host vehicle is either projected at a position that is within the predetermined range around the target and at which the first pattern does not overlap with the second pattern, or is not projected at all. Therefore, the vehicle lighting system of the present disclosure reduces the possibility that overlapping between the first pattern, projected by the host vehicle, and the second pattern, projected by the other vehicle, prevents appropriate information from being presented to the target.
- In the aspect described above, the control unit may be configured to control the projection unit such that the first pattern is projected in the predetermined range around the target when the second determination unit determines that the other vehicle is not projecting the second pattern.
- According to this configuration, the first information on the host vehicle can be presented to the target.
- In the aspect described above, the vehicle lighting system may further include a difference calculation unit configured to calculate difference information that is included in the first information but is not included in the second information. The projection unit can project a third pattern, which indicates the difference information, in addition to the first pattern, and the control unit may be configured to control the projection unit such that, instead of the first pattern, the third pattern is projected in the predetermined range around the target at a position where the third pattern does not overlap with the second pattern, when the second determination unit determines that the other vehicle is projecting the second pattern.
- According to this configuration, the difference information can be presented to the target without interfering with the projection of the second pattern.
- In the aspect described above, each of the first information and the second information has information indicating priority, and the vehicle lighting system may further include a third determination unit that determines the priority of the first information and the second information. The control unit may be configured to control the projection unit such that a request is sent to the other vehicle to stop the projection of the second pattern and, after that, the first pattern is projected, when the second determination unit determines that the other vehicle is projecting the second pattern and the third determination unit determines that the priority of the first information is higher than the priority of the second information.
- According to this configuration, higher priority information can be presented to the target.
- In the aspect described above, the projection unit can project the second pattern and a fourth pattern, which indicates both the first information and the second information, in addition to the first pattern, and the control unit may be configured to control the projection unit such that the second pattern or the fourth pattern is projected instead of the first pattern, when the second determination unit determines that the other vehicle is not projecting the second pattern.
- According to this configuration, the second information on the other vehicle can be presented from the host vehicle by projecting the second pattern or the fourth pattern.
- In the aspect described above, the first determination unit may be configured to determine whether the moving body is the target, based on contact possibility between the host vehicle and the moving body or peripheral visibility of the moving body.
- In the aspect described above, the first determination unit may be configured to determine whether the moving body is the target, based on contact possibility between the other vehicle and the moving body or the peripheral visibility of the moving body.
- In the aspect described above, the second determination unit may be configured to determine whether the other vehicle is projecting the second pattern based on an analysis result of captured image data.
- The operation and effect of the present disclosure will become apparent from the embodiments described below.
- Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:
-
FIG. 1 is a block diagram showing a configuration of a vehicle lighting system in a first embodiment; -
FIG. 2 is a top view showing an example in which a host vehicle projects a projection pattern toward a pedestrian; -
FIG. 3 is a top view showing a reference example in which the host vehicle and another vehicle project projection patterns toward the same pedestrian; -
FIG. 4 is a flowchart showing the flow of the operation of the vehicle lighting system in the first embodiment; -
FIG. 5 is a flowchart showing a target determination method using contact possibility; -
FIG. 6 is a flowchart showing a target determination method using peripheral visibility; -
FIG. 7 is a flowchart showing the flow of the operation of a vehicle lighting system in a second embodiment; -
FIG. 8 is a flowchart showing the processing flow of first projection control; -
FIG. 9 is a top view showing an example in which information on the host vehicle and the other vehicle is projected; -
FIG. 10 is a flowchart showing the processing flow of second projection control; -
FIG. 11 is a top view showing an example in which a projection pattern is projected at a position not overlapping with the position of a projection pattern from the other vehicle; and -
FIG. 12 is a top view showing an example in which the difference information indicating the difference from the projection pattern of the other vehicle is projected. - Embodiments of a vehicle lighting system of the present disclosure are described below with reference to the drawings. In the description below, two embodiments, a first embodiment and a second embodiment, are described.
- A vehicle lighting system in the first embodiment is described with reference to
FIG. 1 to FIG. 6. In the description below, the configuration of the vehicle lighting system, the problems that may arise when using a projection pattern, the operation of the vehicle lighting system, and the technical effects achieved by the vehicle lighting system are described sequentially. - First, the configuration of the vehicle lighting system in the first embodiment is described with reference to
FIG. 1. FIG. 1 is a block diagram showing the configuration of the vehicle lighting system in the first embodiment. - In
FIG. 1, a vehicle lighting system 10 in the first embodiment, mounted on a vehicle such as an automobile, is configured to be able to project a predetermined projection pattern on the road surface by projecting light. The vehicle lighting system 10 includes an in-vehicle camera 110, a radar 120, a communication device 130, a sensor group 140, an electronic control unit (ECU) 200, and a projection unit 300. - The in-
vehicle camera 110 is a camera capable of capturing the area ahead of the vehicle (in other words, the area corresponding to the field of view of the driver). The in-vehicle camera 110 may be a camera that captures images using visible light or a camera that captures images using non-visible light (for example, infrared light). An image captured by the in-vehicle camera 110 is output to an information acquisition unit 210 as image data. - The
radar 120 is a radar capable of recognizing an object that is present in the area ahead of the vehicle. The radar 120 is configured to be able to detect the position and the moving speed of an object around the vehicle by using millimeter waves or a laser. The information on an object detected by the radar 120 is output to the information acquisition unit 210. - The
communication device 130 is configured to be able to receive information on the surrounding environment of the vehicle by radio communication. More specifically, the communication device 130 carries out vehicle-vehicle communication, road-vehicle communication, or pedestrian-vehicle communication to receive information on the other vehicles and pedestrians. The communication device 130 may also be configured to be able to send information on the host vehicle. The information received by the communication device 130 is output to the information acquisition unit 210. - The
sensor group 140 includes a plurality of sensors capable of detecting the status of the host vehicle. For example, the sensor group 140 includes a vehicle speed sensor, an acceleration sensor, a yaw sensor, and the like. The information indicating the status of the host vehicle, detected by the sensor group 140, is output to the information acquisition unit 210. - The
ECU 200, a controller unit having an arithmetic circuit such as a central processing unit (CPU), is configured to be able to control various operations in the vehicle. In particular, the ECU 200 in this embodiment is configured to be able to perform control for drawing a projection pattern that will be described later. The ECU 200 includes the following units as logical or physical processing blocks that are realized therein: the information acquisition unit 210, a projection target determination unit 220, an other vehicle projection determination unit 230, and a projection determination unit 240. - The
information acquisition unit 210 is configured to be able to acquire the information, output from each of the in-vehicle camera 110, the radar 120, and the communication device 130, as the surrounding environment information (that is, the information indicating the surrounding environment of the vehicle, in particular, the information on the objects around the vehicle). The surrounding environment information acquired by the information acquisition unit 210 is output to the projection target determination unit 220. The information acquisition unit 210, together with the in-vehicle camera 110, the radar 120, and the communication device 130, functions as a specific example of the “detection unit (detection means)”. - In addition, the
information acquisition unit 210 is configured to be able to acquire the information, output from the sensor group 140, as host vehicle information (that is, the information indicating the status of the host vehicle). The host vehicle information, acquired by the information acquisition unit 210, is output to the projection target determination unit 220 together with the surrounding environment information. - The projection
target determination unit 220 is configured to be able to detect moving bodies, which are present around the vehicle, using the surrounding environment information received from the information acquisition unit 210. The “moving body” mentioned here means not only an object that is actually moving but also an object that is likely to move. For example, pedestrians who are stopped and other vehicles that are stopped may be detected as moving bodies. - In addition, the projection
target determination unit 220 is configured to be able to determine whether the detected moving body is a target. The “target” mentioned here is a target to which an alert should be sent using a projection pattern. For example, a pedestrian who is crossing the road on which the host vehicle is traveling, in front of the host vehicle, is a target. The information on the target determined by the projection target determination unit 220 (for example, information indicating the attribute and the position of the target, the movement direction, the movement speed, etc.) is output to the other vehicle projection determination unit 230. The projection target determination unit 220 is a specific example of the “first determination unit (first determination means)”. - The other vehicle
projection determination unit 230 is configured to be able to determine whether the other vehicle is projecting a projection pattern to the target determined by the projection target determination unit 220. More specifically, the other vehicle projection determination unit 230 determines whether the other vehicle is projecting a projection pattern, based on the analysis result of the image data captured by the in-vehicle camera 110 or based on the vehicle-vehicle communication carried out by the communication device 130. In addition, the other vehicle projection determination unit 230 may be configured to be able to determine not only whether the other vehicle is projecting a projection pattern but also what type of information the projected projection pattern presents. The determination result of the other vehicle projection determination unit 230 is output to the projection determination unit 240. The other vehicle projection determination unit 230 is a specific example of the “second determination unit (second determination means)”. - The
projection determination unit 240 is configured to be able to determine whether to project a projection pattern from the host vehicle, based on the determination result of the other vehicle projection determination unit 230. In addition, the projection determination unit 240 may be configured to be able to determine not only whether to project a projection pattern from the host vehicle but also the specific projection mode of the projection pattern. The specific determination processing performed by the projection determination unit 240 is described in detail in the description of the operation that will be given later. The determination result of the projection determination unit 240 is output to the projection unit 300. The projection determination unit 240 is a specific example of the “control unit (control means)”. - The
projection unit 300 includes a light (for example, the headlight of the vehicle) capable of changing the direction, and the pattern, of light to be projected. The projection unit 300 is configured to be able to project a predetermined projection pattern in a predetermined range around a target based on the determination result of the projection determination unit 240. The “predetermined range” mentioned here is the range in which a projection pattern, drawn on the road surface by the light from the projection unit 300, is clear enough to be recognized by the target. That is, the “predetermined range” is the range in which the target can visually recognize the predetermined projection pattern. The projection unit 300 may have a function to change the projection position to an appropriate position according to the movement of the host vehicle or the target. The projection unit 300 is a specific example of the “projection unit (projection means)”. - Next, the problems that may arise when using a projection pattern are described in detail with reference to
FIG. 2 and FIG. 3. FIG. 2 is a top view showing an example in which the host vehicle projects a projection pattern toward a pedestrian. FIG. 3 is a top view showing a reference example in which the host vehicle and the other vehicle project projection patterns toward the same pedestrian. - In the example shown in
FIG. 2, there is a pedestrian 30 who is going to cross the roadway in front of a host vehicle 20. If this pedestrian 30 continues to cross the roadway, the host vehicle 20 and the pedestrian 30 may approach and collide. In such a case, the vehicle lighting system 10 in this embodiment projects light toward the area in front of the pedestrian 30 to draw a projection pattern 50 for alerting the pedestrian 30. - The
projection pattern 50 is projected, for example, as a figure including the “exclamation mark” and the “arrow” as shown in the figure. The exclamation mark is generally recognized as a mark meaning “caution” or “danger.” For this reason, the pedestrian 30 who views the projection pattern 50 is expected to pay attention to the direction indicated by the arrow. Using the projection pattern 50 in this way makes it possible to appropriately alert the pedestrian 30 (i.e., target) that the host vehicle 20 is approaching. - In the reference example in
FIG. 3, there is the pedestrian 30 who is going to cross the roadway and another vehicle 40 that is traveling in the oncoming lane. In this case, if the other vehicle 40 has a vehicle lighting system 10 similar to that of the host vehicle 20, it is assumed that the other vehicle 40 also projects a projection pattern 50 b to alert the pedestrian 30. - However, if a
projection pattern 50 a projected by the host vehicle 20 and a projection pattern 50 b projected by the other vehicle 40 overlap, the projection pattern 50 a and the projection pattern 50 b may form a shape that is difficult to recognize or cannot be recognized. In such a case, it becomes difficult to appropriately alert the pedestrian 30. In addition, the overlapping between the projection pattern 50 a and the projection pattern 50 b may result in forming a pattern that is recognized as a pattern for presenting incorrect information to the pedestrian 30. For example, the resulting pattern may be recognized as a pattern for prompting the pedestrian to approach the projection pattern 50. In this case, the resulting pattern may increase danger to the pedestrian 30 rather than alert the pedestrian 30. - As described above, when there is a plurality of vehicles capable of projecting the
projection pattern 50, a plurality of projection patterns 50 is projected toward the same target, sometimes resulting in a situation in which it is impossible to appropriately alert the target. To avoid such a situation, the vehicle lighting system 10 in this embodiment performs the operation that is described in detail below. - The operation of the
vehicle lighting system 10 in the first embodiment is described in detail below with reference to FIG. 4. FIG. 4 is a flowchart showing the flow of the operation of the vehicle lighting system in the first embodiment. - In
FIG. 4, when the vehicle lighting system 10 in this embodiment starts operation, the information acquisition unit 210 first acquires the host vehicle information detected by the sensor group 140 (step S101). Next, the information acquisition unit 210 acquires the surrounding environment information detected by the in-vehicle camera 110, the radar 120, and the communication device 130 (step S102). The processing in step S101 and step S102 may be performed in reverse order or concurrently. The host vehicle information and the surrounding environment information, acquired by the information acquisition unit 210, are output to the projection target determination unit 220. - Next, the projection
target determination unit 220 determines whether there is a target, to which an alert should be sent, around the host vehicle 20 (step S103). From the surrounding environment information, the projection target determination unit 220 detects a moving body that is likely to be a target and, based on the host vehicle information and the surrounding environment information, determines whether the detected moving body is a target. The target determined in this step includes not only a moving body to which an alert should be sent in relation to the host vehicle 20 but also a moving body to which an alert should be sent in relation to the other vehicle 40. That is, if an alert should not be sent from the host vehicle 20, but should be sent from the other vehicle 40, to a moving body, the moving body is determined as a target. - Whether or not the moving body is a target is determined using contact possibility or peripheral visibility calculated, for example, from the surrounding environment information. “Contact possibility” is a value indicating the possibility that the
host vehicle 20 will contact the moving body or the possibility that the other vehicle 40 will contact the moving body. For example, the contact possibility is calculated based on the distance or the relative speed between the host vehicle 20 or the other vehicle 40 and the moving body. “Peripheral visibility” is a value indicating the visibility around the moving body as viewed from the driver of the host vehicle 20 or the other vehicle 40, or the visibility in the direction of the host vehicle 20 or the other vehicle 40 as viewed from the moving body. For example, the peripheral visibility is calculated based on the ambient brightness and whether or not there is an obstacle that reduces the visibility. - The target determination flow using contact possibility and peripheral visibility is described in detail below with reference to
FIG. 5 and FIG. 6. FIG. 5 is a flowchart showing a target determination method using contact possibility. FIG. 6 is a flowchart showing a target determination method using peripheral visibility. The processing shown in FIG. 5 and FIG. 6 is the processing performed in step S103 shown in FIG. 4. - A target is determined using contact possibility as shown in
FIG. 5. First, it is determined whether there is a possibility that the moving body will contact the host vehicle 20 (step S103 a). More specifically, the contact possibility between the moving body and the host vehicle 20 is calculated based on the distance and the relative speed between the moving body and the host vehicle 20. Then, the calculated contact possibility is compared with the predetermined first threshold to determine whether there is a possibility that the moving body and the host vehicle 20 will contact each other. When it is determined that there is a possibility that the moving body and the host vehicle 20 will contact (step S103 a: YES), then it is determined that the moving body is a target and the processing proceeds to step S104. - On the other hand, when it is determined that there is no possibility that the moving body will contact the host vehicle 20 (step S103 a: NO), then it is determined whether there is a possibility that the moving body will contact the other vehicle 40 (step S103 b). More specifically, the contact possibility between the moving body and the
other vehicle 40 is calculated based on the distance and the relative speed between the moving body and the other vehicle 40. Then, the calculated contact possibility is compared with the predetermined second threshold to determine whether there is a possibility that the moving body and the other vehicle 40 will contact each other. When it is determined that there is a possibility that the moving body and the other vehicle 40 will contact (step S103 b: YES), then it is determined that the moving body is a target and the processing proceeds to step S104. On the other hand, when it is determined that there is no possibility that the moving body and the other vehicle 40 will contact (step S103 b: NO), then it is determined that the moving body is not a target and the processing proceeds to step S105. - To determine whether the moving body is a target using peripheral visibility, it is first determined, as shown in
FIG. 6, whether the visibility of the moving body as viewed from the host vehicle 20 is poor (step S103 c). More specifically, the peripheral visibility around the moving body as viewed from the host vehicle 20 is calculated based on the brightness, and on whether or not there is an obstacle, around the moving body. After that, the calculated peripheral visibility is compared with the predetermined third threshold to determine whether the peripheral visibility of the moving body as viewed from the host vehicle 20 is poor. When it is determined that the peripheral visibility of the moving body as viewed from the host vehicle 20 is poor (step S103 c: YES), then it is determined that the moving body is a target and the processing proceeds to step S104. - On the other hand, when it is determined that the visibility of the moving body as viewed from the
host vehicle 20 is not poor (step S103 c: NO), then it is determined whether the visibility of the host vehicle 20 as viewed from the moving body is poor (step S103 d). More specifically, the peripheral visibility around the host vehicle 20 as viewed from the moving body is calculated based on the brightness, and on whether or not there is an obstacle, around the host vehicle 20. After that, the calculated peripheral visibility is compared with the predetermined fourth threshold to determine whether the visibility of the host vehicle 20 as viewed from the moving body is poor. When it is determined that the visibility of the host vehicle 20 as viewed from the moving body is poor (step S103 d: YES), then it is determined that the moving body is a target and the processing proceeds to step S104. On the other hand, when it is determined that the visibility of the host vehicle 20 as viewed from the moving body is not poor (step S103 d: NO), then it is determined that the moving body is not a target and the processing proceeds to step S105. - In addition to, or instead of, the peripheral visibility of the moving body as viewed from the
host vehicle 20 and the peripheral visibility of the host vehicle 20 as viewed from the moving body, the peripheral visibility of the moving body as viewed from the other vehicle 40 and the peripheral visibility of the other vehicle 40 as viewed from the moving body may be determined. In addition, the peripheral visibility as well as the driving operation of the driver of the host vehicle 20 may be detected to determine whether the driver of the host vehicle 20 has recognized the presence of the moving body. Similarly, the peripheral visibility as well as the movement of the moving body may be detected to determine whether the moving body has recognized the presence of the host vehicle 20. - Returning to
FIG. 4, when the projection target determination unit 220 determines that the moving body is not a target (that is, an alert should not be sent) (step S103: NO), the host vehicle 20 does not project a projection light (step S105) and a series of processing is terminated. That is, the host vehicle 20 does not send an alert using the projection pattern 50. - On the other hand, when the projection
target determination unit 220 determines that the moving body is a target (that is, an alert should be sent) (step S103: YES), the other vehicle projection determination unit 230 determines whether the other vehicle 40 is already projecting the projection pattern 50 toward the target (step S104). The projection pattern 50 projected by the other vehicle 40 is a specific example of the “second pattern”. - When it is determined that the
other vehicle 40 is already projecting the projection pattern 50 toward the target (step S104: YES), the projection determination unit 240 determines that the projection pattern 50 should not be projected from the host vehicle 20 toward the target. Therefore, if the other vehicle 40 is already projecting the projection pattern 50, the host vehicle 20 does not project the projection pattern 50 (step S105). - On the other hand, when it is determined that the
other vehicle 40 is not projecting the projection pattern 50 toward the target (step S104: NO), the projection determination unit 240 determines that the projection pattern 50 should be projected from the host vehicle 20 toward the target. Therefore, if the other vehicle is not projecting the projection pattern 50, the projection pattern 50 is projected from the host vehicle 20 toward the target (step S106). The projection pattern 50 projected by the host vehicle 20 is a specific example of the “first pattern”. - Next, the technical effect achieved by the
vehicle lighting system 10 in the first embodiment is described below. - As described with reference to
FIG. 1 to FIG. 6, when there is a target (i.e., the pedestrian 30 or the like) around the host vehicle 20 to which an alert should be sent, the vehicle lighting system 10 in the first embodiment determines whether to project the projection pattern 50 from the host vehicle 20 depending upon whether or not the projection pattern 50 is being projected from the other vehicle 40. - Therefore, when the
other vehicle 40 is already projecting the projection pattern 50, the projection pattern 50 is not projected from the host vehicle 20. In other words, when the other vehicle 40 is already projecting the projection pattern 50, an alert is sent only by the projection pattern 50 b projected by the other vehicle 40. Projecting the projection pattern 50 in this way avoids a situation in which overlapping between the projection pattern 50 a, projected from the host vehicle 20, and the projection pattern 50 b, projected from the other vehicle 40, makes it impossible to send an appropriate alert to the target. - On the other hand, when the
other vehicle 40 is not projecting the projection pattern 50 to the target, the projection pattern 50 is projected from the host vehicle 20. In other words, when the other vehicle 40 is not projecting the projection pattern 50, an alert is sent only by the projection pattern 50 a projected by the host vehicle 20. Projecting the projection pattern 50 in this way also avoids a situation in which overlapping between the projection pattern 50 a, projected from the host vehicle 20, and the projection pattern 50 b, projected from the other vehicle 40, makes it impossible to send an appropriate alert to the target. - Next, a vehicle lighting system in a second embodiment is described with reference to
FIG. 7 to FIG. 12. The second embodiment has a configuration almost similar to that of the first embodiment already described above, except for a part of the operation. Therefore, in the description below, only the part different from the first embodiment is described in detail, and the duplicate description of the overlapping part is omitted as appropriate.
- The operation of the
vehicle lighting system 10 in the second embodiment is described in detail with reference to FIG. 7. FIG. 7 is a flowchart showing the flow of the operation of the vehicle lighting system in the second embodiment. In FIG. 7, the same reference numeral is given to the same processing as that of the first embodiment described above (see FIG. 4), and the description thereof is omitted as appropriate. - In
FIG. 7, when the vehicle lighting system 10 in the second embodiment starts operation, the same processing as that in steps S101 to S103 in the first embodiment is performed first. That is, the information acquisition unit 210 acquires the host vehicle information (step S101). Next, the information acquisition unit 210 acquires the surrounding environment information (step S102). After that, the projection target determination unit 220 determines whether there is a target, to which an alert should be sent, around the host vehicle 20 (step S103). When it is determined that there is no target, to which an alert should be sent, around the host vehicle 20 (step S103: NO), the host vehicle 20 does not project a projection light (step S209), and a series of processing is terminated. That is, the host vehicle 20 does not send an alert using the projection pattern 50. - On the other hand, when it is determined that there is a target (step S103: YES), the other vehicle
projection determination unit 230 determines whether the other vehicle 40 is already projecting the projection pattern toward the target (step S204). When it is determined that the other vehicle 40 is not projecting the projection pattern (step S204: NO), the projection determination unit 240 performs the first projection control (step S210). The first projection control is a control operation performed when the other vehicle 40 is not projecting the projection pattern 50 (in other words, a control operation performed when only the host vehicle 20 projects the projection pattern 50). - In the following, the first projection control is described in detail with reference to
FIG. 8. FIG. 8 is a flowchart showing the processing flow of the first projection control. - In
FIG. 8, when the first projection control is started, it is first determined whether the information (presentation information) to be presented to the target using the projection pattern relates to the other vehicle 40 (step S301). - When it is determined that the presentation information does not relate to the other vehicle 40 (in other words, the presentation information relates to the host vehicle 20) (step S301: NO), the
projection pattern 50 indicating the information on the host vehicle 20 is projected from the host vehicle 20 to the target (step S302). The projection pattern 50 projected from the host vehicle 20 in this case is a specific example of the “first pattern”. - On the other hand, when it is determined that the presentation information relates to the other vehicle 40 (step S301: YES), then it is determined whether the presentation information relates to the
other vehicle 40 as well as to the host vehicle 20 (step S303). More specifically, it is determined whether the presentation information relates to both the host vehicle 20 and the other vehicle 40 or only to the other vehicle 40. - When it is determined that the presentation information relates also to the host vehicle 20 (step S303: YES), the
projection pattern 50, which indicates the information on the host vehicle 20 and the other vehicle 40, is projected from the host vehicle 20 toward the target (step S304). That is, the host vehicle 20 presents not only the information on the host vehicle 20 but also the information on the other vehicle 40 to the target. The projection pattern 50 projected from the host vehicle 20 in this case is a specific example of the “fourth pattern”. - The
projection pattern 50 that indicates the information on the host vehicle 20 and the other vehicle 40 is described below in detail with reference to FIG. 9. FIG. 9 is a top view showing a projection example in which the information on the host vehicle and the other vehicle is projected. - As shown in
FIG. 9, the projection pattern 50, which indicates the information on the host vehicle 20 and the other vehicle 40, includes two “arrows” in addition to the “exclamation mark”. These two arrows indicate the direction from which the host vehicle 20 is approaching and the direction from which the other vehicle 40 is approaching, respectively. Therefore, the pedestrian 30 who sees such a projection pattern 50 is expected to pay attention to the directions indicated by the two arrows (that is, both the left and the right directions as seen from the pedestrian 30). - Returning to
FIG. 8, when it is determined that the presentation information does not relate to the host vehicle 20 (in other words, the presentation information relates only to the other vehicle 40) (step S303: NO), the projection pattern 50 that indicates the information on the other vehicle 40 is projected from the host vehicle 20 toward the target (step S305). That is, instead of the information on the host vehicle, the host vehicle 20 presents the information on the other vehicle 40 to the target. The projection pattern 50 projected from the host vehicle 20 in this case is a specific example of the “second pattern”. - In the projection in step S304 and step S305 described above, the
host vehicle 20 presents the information on the other vehicle 40 even when the other vehicle 40 does not present the information on itself. Therefore, an appropriate alert can be sent to the target, for example, even when the other vehicle 40 does not have a system similar to the vehicle lighting system 10. - The
projection pattern 50 projected during the first projection control is projected when the other vehicle 40 is not projecting the projection pattern 50. That is, only one projection pattern is projected. For this reason, the projection pattern 50 projected from the host vehicle 20 during the first projection control only needs to be projected at a position where the target can visually recognize the pattern. There is no need to consider the positional relation with another projection pattern as in the case of the second projection control that will be described later. - Returning to
FIG. 7, when it is determined that the other vehicle 40 is projecting the projection pattern (step S204: YES), the priority of the information to be presented by the host vehicle 20 and the priority of the information to be presented by the other vehicle 40 are compared (step S205). The “priority” mentioned here is the parameter for determining which information is to be given priority: the information to be presented by the host vehicle 20 or the information to be presented by the other vehicle 40. For example, this parameter is set in advance according to the importance determined by the type of information (more specifically, according to which information is more directly linked to danger). The unit that determines the priority of the information to be presented by the host vehicle 20 and the priority of the information to be presented by the other vehicle 40 is a specific example of the “third determination unit (third determination means)”. Further, the ECU 200 may include this unit as a logical or physical processing block. - In addition, the priority may be a parameter that varies depending not only upon the type of information but also upon the situation. More specifically, for the information to be presented to the
pedestrian 30 closer to the other vehicle 40 than to the host vehicle 20, it is only required that the priority of the information to be presented from the other vehicle 40 be set higher than the priority of the information to be presented from the host vehicle 20. Conversely, for the information to be presented to the pedestrian 30 closer to the host vehicle 20 than to the other vehicle 40, it is only required that the priority of the information to be presented from the host vehicle 20 be set higher than the priority of the information to be presented from the other vehicle 40. - When it is determined as a result of the priority comparison that the priority of the information to be presented by the
host vehicle 20 is higher than the priority of the information to be presented by the other vehicle 40 (step S206: YES), a projection stop request is output from the host vehicle 20 to the other vehicle 40. More specifically, a request for stopping the projection of the projection pattern 50 is output to the other vehicle 40 using vehicle-vehicle communication carried out by the communication device 130. - After outputting the projection stop request, it is determined whether the
other vehicle 40 has actually stopped the projection of the projection pattern 50. This determination is made by carrying out vehicle-vehicle communication using the communication device 130 or by analyzing the image data captured by the in-vehicle camera 110. When the other vehicle 40 has stopped the projection of the projection pattern 50 (step S208: YES), the first projection control described above is performed (step S210). That is, the projection control is performed on the premise that the other vehicle 40 is not projecting the projection pattern. - When the projection stop request is output but the projection from the
other vehicle 40 is not stopped (step S208: NO), the projection stop request is output to the other vehicle 40 repeatedly (step S207). However, because the other vehicle 40 may not have a unit for receiving the projection stop request, the output of the projection stop request may be stopped after the request has been output a predetermined number of times. In this case, the second projection control (step S211), which will be described below, may be performed as an exceptional operation. - On the other hand, when it is determined as a result of the priority comparison that the priority of the information to be presented by the
host vehicle 20 is equal to or lower than the priority of the information to be presented by the other vehicle 40 (step S206: NO), the projection determination unit 240 performs the second projection control (step S211). The second projection control is a control operation that is performed when the other vehicle 40 is already projecting the projection pattern 50 (in other words, a control operation performed when both the host vehicle 20 and the other vehicle 40 project the projection pattern 50). - The second projection control is described in detail below with reference to
FIG. 10. FIG. 10 is a flowchart showing the processing flow of the second projection control. - In
FIG. 10, when the second projection control is started, it is determined whether the other vehicle 40 is presenting the information on the host vehicle 20 (that is, whether the information indicated by the projection pattern 50, projected by the other vehicle 40, includes the information on the host vehicle 20) (step S401). - When it is determined that the
other vehicle 40 is presenting the information on the host vehicle 20 (step S401: YES), the host vehicle 20 does not project the projection pattern 50 toward the target (step S402). This is because there is no need to newly present the information on the host vehicle 20, since it is already included in the projection pattern 50 projected from the other vehicle 40. However, when the information on the host vehicle 20 is insufficient, the host vehicle 20 may project a projection pattern 50 that compensates for the insufficient information. - On the other hand, when it is determined that the
other vehicle 40 is not presenting the information on the host vehicle 20 (step S401: NO), it is then determined whether a new projection pattern 50 can be projected in an area other than the area in which the other vehicle 40 is currently projecting the projection pattern 50 (step S403). More specifically, based on the surrounding environment information acquired by the information acquisition unit 210, it is determined whether there is an area, large enough to project the new projection pattern 50, within the visible range of the target. - When it is determined that the
projection pattern 50 can be projected in the other area (step S403: YES), the host vehicle 20 projects the projection pattern at a non-overlapping position where the projection pattern 50 does not overlap with the projection pattern 50 already projected by the other vehicle 40 (step S404). The "non-overlapping position" mentioned here means not only a position where the projection pattern 50 projected by the other vehicle 40 and the projection pattern 50 projected by the host vehicle 20 do not overlap at all, but also a position where the overlapping portion, if any, is so small that there is little or no loss in what is meant by the projection pattern 50. The "non-overlapping position" is assumed to be in a predetermined range around the target and in an area visible from the target. - Projection at a non-overlapping position, where the
projection pattern 50 from the host vehicle 20 does not overlap with the projection pattern 50 from the other vehicle 40, is described in more detail below with reference to FIG. 11. FIG. 11 is a top view showing an example in which the projection pattern is projected at a position not overlapping with the position of the projection pattern from the other vehicle. - It is assumed that, as shown in
FIG. 11, the other vehicle 40 is projecting the projection pattern 50 b (more specifically, the pattern including the "exclamation mark" and the "right arrow (viewed from the pedestrian)") to inform the pedestrian 30 that the other vehicle 40 is approaching. In such a case, the host vehicle 20 projects the projection pattern 50 a (more specifically, the pattern including the "exclamation mark" and the "left arrow (viewed from the pedestrian)"), next to the projection pattern 50 b projected by the other vehicle, to inform the pedestrian 30 that the host vehicle is approaching. The pedestrian 30 who views the projection pattern 50 a and the projection pattern 50 b is expected to pay attention to both the right direction and the left direction. - The above-described projection mode is exemplary only, and the position of the
projection pattern 50 a, projected by the host vehicle 20, need not necessarily be next to the projection pattern 50 b projected by the other vehicle 40. It is only required that the projection pattern 50 a be at a position where it is visible to the pedestrian 30 and does not overlap with the projection pattern 50 b projected by the other vehicle 40. However, to give an appropriate alert to the pedestrian 30, it is preferable that the projection pattern 50 a projected by the host vehicle 20 be at a position near the projection pattern 50 b projected by the other vehicle 40. - Returning to
FIG. 10, when it is determined that the projection pattern 50 cannot be projected in another area (step S403: NO), it is then determined whether the difference information can be projected additionally in the area surrounding the projection pattern 50 projected by the other vehicle 40 (step S405). When it is determined that the difference information can be projected additionally (step S405: YES), the host vehicle 20 projects the projection pattern 50 that indicates the difference information (step S406). - The "difference information" mentioned here refers to the difference between the information to be presented to the target by the
host vehicle 20 and the information indicated by the projection pattern 50 projected by the other vehicle 40. The difference information is calculated by the projection determination unit 240. More specifically, the difference information can be calculated by removing the overlapping information, which overlaps with the information indicated by the projection pattern 50 projected by the other vehicle 40, from the information to be presented from the host vehicle 20. In this case, the information indicated by the projection pattern 50 projected by the other vehicle 40 may be acquired by vehicle-vehicle communication carried out by the communication device 130 or by analyzing the data captured by the in-vehicle camera 110. - The amount of the difference information is smaller than the amount of the information to be presented to the target by the
host vehicle 20. Therefore, the projection pattern 50 indicating the difference information can be simplified and reduced in size as compared with the independent projection pattern 50 (for example, see FIG. 11). This means that, even if the independent projection pattern 50 cannot be projected separately due to a shortage of projection area, there is a possibility that the projection pattern 50 showing the difference information can be projected. In step S405 described above, it is determined whether the area in which the projection pattern will be projected, while not large enough for the independent projection pattern 50, is large enough for the projection pattern 50 indicating the difference information. - The projection of the difference information is described in detail below with reference to
FIG. 12. FIG. 12 is a top view showing an example of projection in which the difference information, indicating the difference from the projection pattern of the other vehicle, is projected. - It is assumed, as shown in
FIG. 12, that the other vehicle 40 is projecting the projection pattern 50 b (more specifically, the pattern including the "exclamation mark" and the "right arrow (viewed from the pedestrian)") to inform the pedestrian 30 that the other vehicle 40 is approaching. In such a case, the host vehicle 20 projects the projection pattern 50 a (more specifically, the pattern including the "left arrow (viewed from the pedestrian)") in the area immediately above the projection pattern 50 b, projected by the other vehicle, to inform the pedestrian 30 that the host vehicle 20 is approaching. The pedestrian 30 who views the projection pattern 50 a and the projection pattern 50 b is expected to pay attention to both the right direction and the left direction. The projection pattern 50 a mentioned here is a specific example of the "third pattern". - In the situation described above, the information to be originally presented from the
host vehicle 20 to the pedestrian 30 is the information indicating an alert to the pedestrian 30 (corresponding to the exclamation mark) and the information indicating the direction from which the host vehicle 20 is approaching (corresponding to the arrow) (for example, see FIG. 11). However, the information indicating an alert to the pedestrian 30 (corresponding to the exclamation mark) is already included in the projection pattern 50 b projected from the other vehicle 40, and there is no need to project the same information redundantly. Therefore, even when only the projection pattern 50 a, which includes only the arrow indicating the direction from which the host vehicle 20 is approaching, is projected, it is possible to present information equivalent to the information indicated by the independent projection pattern 50 that is projected separately. - Returning to
FIG. 10 again, when it is determined that the difference information cannot be projected additionally (step S405: NO), the host vehicle 20 does not project the projection pattern 50 (step S407). Doing so prevents the information indicated by the projection pattern 50 b, projected by the other vehicle 40, from being blocked by the projection pattern 50 a that would be projected by the host vehicle 20. - The technical effects achieved by the
vehicle lighting system 10 in the second embodiment are described below. As described with reference to FIG. 7 to FIG. 12, when there is a target around the host vehicle 20 (i.e., the pedestrian 30) to which an alert should be sent, the vehicle lighting system 10 in the second embodiment determines whether, and in what manner, the host vehicle 20 will project the projection pattern 50, according to various conditions. - More specifically, according to the priority of the information indicated by the
projection patterns 50 projected by the host vehicle 20 and the other vehicle 40, and according to the contents of the information indicated by the projection pattern projected by the other vehicle 40, the contents of the information to be presented by the projection pattern 50 projected by the host vehicle 20 and the projection position of the projection pattern 50 are changed, respectively, to the appropriate contents and the appropriate projection position. Thus, an alert can be given more appropriately in this embodiment than in the first embodiment, in which the host vehicle 20 does not project the projection pattern 50 when the other vehicle 40 is already projecting the projection pattern 50. - Although the examples have been described in which the target, to which an alert should be sent, is the
pedestrian 30 in the first embodiment and the second embodiment described above, the target is not limited to the pedestrian 30. For example, when there is a third vehicle to which an alert should be sent from the host vehicle 20 and the other vehicle 40, the third vehicle may be the target. - It is to be understood that the present disclosure is not limited to the embodiments described above but may be changed as appropriate within the scope of the claims and within the spirit and the concept of the present disclosure understood from this specification, and that a vehicle lighting system to which such changes are added is also included in the technical scope of the present disclosure.
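The control flow described above with reference to FIG. 7 and FIG. 10 can be summarized as a small decision procedure. This is an illustrative sketch only: the boolean flags, the helper callables, and the retry limit are hypothetical stand-ins for the determinations made by the ECU 200 and are not part of the disclosure.

```python
MAX_STOP_REQUESTS = 3  # illustrative; the description says only "a predetermined number of times"


def second_projection_control(other_presents_host_info,
                              free_area_available,
                              difference_area_available):
    """Steps S401-S407 of FIG. 10: decide the host vehicle's action when the
    other vehicle keeps projecting its own projection pattern 50."""
    if other_presents_host_info:          # S401: host's information already shown
        return "do_not_project"           # S402
    if free_area_available:               # S403: room for an independent pattern
        return "project_non_overlapping"  # S404
    if difference_area_available:         # S405: room for a smaller difference pattern
        return "project_difference"       # S406
    return "do_not_project"               # S407


def projection_control(host_priority, other_priority, send_stop_request,
                       other_stopped, **second_control_flags):
    """Steps S205-S211 of FIG. 7: compare priorities, then either request the
    other vehicle to stop (leading to the first projection control) or fall
    back to the second projection control."""
    if host_priority > other_priority:              # S206: YES
        for _ in range(MAX_STOP_REQUESTS):          # S207, repeated a bounded number of times
            send_stop_request()
            if other_stopped():                     # S208: YES
                return "first_projection_control"   # S210
        # Other vehicle never stopped: exceptional fallback to S211 below.
    return second_projection_control(**second_control_flags)


# The FIG. 11 case: the host has lower priority, the other vehicle's pattern
# lacks the host's information, but a free area exists next to it.
action = projection_control(1, 2, lambda: None, lambda: False,
                            other_presents_host_info=False,
                            free_area_available=True,
                            difference_area_available=False)
```

In this sketch, `action` resolves to the non-overlapping projection of step S404; raising `host_priority` above `other_priority` with a cooperative `other_stopped` instead routes the call through the stop-request loop to the first projection control.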
Claims (9)
1. A vehicle lighting system comprising:
a first detection unit configured to detect a moving body around a host vehicle;
a first determination unit configured to determine whether the moving body is a target to which an alert should be sent;
a projection unit capable of projecting a first pattern in a predetermined range around the target when the first determination unit determines that the moving body is the target, the first pattern indicating first information on the host vehicle;
a second determination unit configured to determine whether another vehicle is projecting a second pattern in the predetermined range around the target, the second pattern indicating second information on the other vehicle; and
a control unit configured to control the projection unit such that the first pattern is projected at a position that is within the predetermined range around the target and at which the first pattern does not overlap with the second pattern, or such that the first pattern is not projected, when the second determination unit determines that the other vehicle is projecting the second pattern.
2. The vehicle lighting system according to claim 1 , wherein
the control unit is configured to control the projection unit such that the first pattern is projected in the predetermined range around the target when the second determination unit determines that the other vehicle is not projecting the second pattern.
3. The vehicle lighting system according to claim 1 , further comprising,
a difference calculation unit configured to calculate difference information that is included in the first information but is not included in the second information, wherein
the projection unit can project a third pattern in addition to the first pattern, the third pattern indicating the difference information, and
the control unit is configured to control the projection unit such that, instead of the first pattern, the third pattern is projected in the predetermined range around the target at a position where the third pattern does not overlap with the second pattern, when the second determination unit determines that the other vehicle is projecting the second pattern.
4. The vehicle lighting system according to claim 1 , further comprising,
a third determination unit that determines priority of the first information and the second information, each of the first information and the second information having information indicating the priority, wherein
the control unit is configured to control the projection unit such that a request is sent to the other vehicle to stop the projection of the second pattern and, after that, the first pattern is projected, when the second determination unit determines that the other vehicle is projecting the second pattern and the third determination unit determines that the priority of the first information is higher than the priority of the second information.
5. The vehicle lighting system according to claim 1 , wherein
the projection unit can project the second pattern and a fourth pattern in addition to the first pattern, the fourth pattern indicating both the first information and the second information, and
the control unit is configured to control the projection unit such that the second pattern or the fourth pattern is projected instead of the first pattern, when the second determination unit determines that the other vehicle is not projecting the second pattern.
6. The vehicle lighting system according to claim 1 , wherein
the first determination unit is configured to determine whether the moving body is the target, based on contact possibility between the host vehicle and the moving body or peripheral visibility of the moving body.
7. The vehicle lighting system according to claim 6 , wherein
the first determination unit is configured to determine whether the moving body is the target based on contact possibility between the other vehicle and the moving body or the peripheral visibility of the moving body.
8. The vehicle lighting system according to claim 1 , wherein
the second determination unit is configured to determine whether the other vehicle is projecting the second pattern based on an analysis result of captured image data.
9. A vehicle lighting system comprising,
a projection unit configured to project a first pattern, the first pattern indicating first information on a host vehicle; and
an electronic control unit configured to
detect a moving body around the host vehicle,
determine whether the moving body is a target to which an alert should be sent,
project the first pattern in a predetermined range around the target by the projection unit when it is determined that the moving body is the target,
determine whether another vehicle is projecting a second pattern in a predetermined range around the target, the second pattern indicating second information on the other vehicle; and
control the projection unit such that the first pattern is projected at a position that is within the predetermined range around the target and at which the first pattern does not overlap with the second pattern, or such that the first pattern is not projected, when it is determined that the other vehicle is projecting the second pattern.
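The "difference calculation unit" recited in claim 3 can be illustrated as a simple set difference over discrete items of information, matching the FIG. 12 example in which the other vehicle's pattern already contains the exclamation mark. This is a minimal sketch; the function name and the string labels are illustrative assumptions, not part of the claims.

```python
def difference_information(first_information, second_information):
    """Items included in the first information (host vehicle) but not in the
    second information (other vehicle); the host's ordering is preserved."""
    already_shown = set(second_information)
    return [item for item in first_information if item not in already_shown]


# The host wants to show an alert plus its approach direction; the other
# vehicle's pattern already shows the alert, so only the arrow remains to be
# projected as the third pattern.
first_information = ["exclamation_mark", "left_arrow"]
second_information = ["exclamation_mark", "right_arrow"]
third_pattern_content = difference_information(first_information, second_information)
```

Because the difference contains fewer items than the full first information, the third pattern indicating it can be smaller than an independent pattern, which is what makes step S406 feasible in a constrained projection area.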
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016043543A JP6500814B2 (en) | 2016-03-07 | 2016-03-07 | Vehicle lighting system |
JP2016-043543 | 2016-03-07 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170253177A1 true US20170253177A1 (en) | 2017-09-07 |
US9987979B2 US9987979B2 (en) | 2018-06-05 |
Family
ID=58227950
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/444,988 Active US9987979B2 (en) | 2016-03-07 | 2017-02-28 | Vehicle lighting system |
Country Status (4)
Country | Link |
---|---|
US (1) | US9987979B2 (en) |
EP (1) | EP3217377B1 (en) |
JP (1) | JP6500814B2 (en) |
CN (1) | CN107161076B (en) |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180261081A1 (en) * | 2017-03-10 | 2018-09-13 | Subaru Corporation | Image display device |
US20180257550A1 (en) * | 2017-03-10 | 2018-09-13 | Subaru Corporation | Image display device |
US20180319325A1 (en) * | 2015-10-27 | 2018-11-08 | Koito Manufacturing Co., Ltd. | Vehicular illumination device, vehicle system, and vehicle |
US10232713B2 (en) * | 2017-04-12 | 2019-03-19 | Lg Electronics Inc. | Lamp for a vehicle |
US10252721B1 (en) * | 2017-11-27 | 2019-04-09 | Honda Motor Co., Ltd. | System and method for providing a vehicle convoy status indication |
US10300846B2 (en) | 2017-03-10 | 2019-05-28 | Subaru Corporation | Image display apparatus |
US10308172B2 (en) * | 2017-03-10 | 2019-06-04 | Subaru Corporation | Image display device |
US10325488B2 (en) * | 2017-03-10 | 2019-06-18 | Subaru Corporation | Image display device |
US10358083B2 (en) | 2017-03-10 | 2019-07-23 | Subaru Corporation | Image display device |
US20190392711A1 (en) * | 2017-03-09 | 2019-12-26 | Bayerische Motoren Werke Aktiengesellschaft | Motor Vehicle Comprising a Lighting Module for Generating a Set of Symbols |
US10558416B2 (en) | 2017-03-10 | 2020-02-11 | Subaru Corporation | Image display device |
CN110834583A (en) * | 2018-08-15 | 2020-02-25 | 株式会社小糸制作所 | Display system for vehicle and vehicle |
US10627819B1 (en) * | 2018-10-11 | 2020-04-21 | Pony Ai Inc. | On-site notification from autonomous vehicle for traffic safety |
US20200298846A1 (en) * | 2016-12-13 | 2020-09-24 | Hyundai Motor Company | Apparatus for preventing pedestrian collision accident, system having the same, and method thereof |
WO2020189636A1 (en) * | 2019-03-20 | 2020-09-24 | Ricoh Company, Ltd. | Information providing system, moving body, information providing method, and information providing program |
CN112119433A (en) * | 2018-05-24 | 2020-12-22 | 罗伯特·博世有限公司 | Safeguarding and taking safety measures against dangerous points by vehicle warning |
CN112208539A (en) * | 2019-07-09 | 2021-01-12 | 奥迪股份公司 | System, vehicle, method, and medium for autonomous driving of a vehicle |
CN112889097A (en) * | 2018-10-17 | 2021-06-01 | 戴姆勒股份公司 | Road crossing channel visualization method |
US20210206314A1 (en) * | 2016-11-18 | 2021-07-08 | Panasonic Intellectual Property Management Co., Ltd. | Notifying device and notifying system |
US11084418B2 (en) * | 2019-04-10 | 2021-08-10 | Hyundai Motor Company | Apparatus and method for outputting platooning information in vehicle |
US11318879B2 (en) * | 2017-09-15 | 2022-05-03 | Mitsubishi Electric Corporation | Irradiation apparatus and irradiation method |
US11396262B2 (en) * | 2018-02-13 | 2022-07-26 | Honda Motor Co., Ltd. | Saddle type vehicle |
EP3974254A4 (en) * | 2019-07-15 | 2022-09-07 | Great Wall Motor Company Limited | Method for using vehicle light to project pattern, vehicle light system, and vehicle |
US20220381415A1 (en) * | 2020-02-17 | 2022-12-01 | Koito Manufacturing Co., Ltd. | Lamp system |
US11532232B2 (en) * | 2019-11-01 | 2022-12-20 | Lg Electronics Inc. | Vehicle having dangerous situation notification function and control method thereof |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3041110B1 (en) * | 2015-09-14 | 2018-03-16 | Valeo Vision | PROJECTION METHOD FOR A MOTOR VEHICLE OF AN IMAGE ON A PROJECTION SURFACE |
JP6554131B2 (en) * | 2017-03-15 | 2019-07-31 | 株式会社Subaru | Vehicle display system and method for controlling vehicle display system |
JP6981174B2 (en) * | 2017-10-25 | 2021-12-15 | トヨタ自動車株式会社 | Vehicle headlight device |
JP2020015471A (en) * | 2018-07-27 | 2020-01-30 | パイオニア株式会社 | Projection control device, projection control method, projection control program, and storage medium |
CN210191316U (en) * | 2018-08-06 | 2020-03-27 | 株式会社小糸制作所 | Display system for vehicle and vehicle |
FR3086901B1 (en) * | 2018-10-01 | 2020-11-13 | Valeo Vision | PROCESS FOR DRIVING PIXELLIZED LIGHT BEAM PROJECTION MODULES FOR VEHICLES |
JP7271941B2 (en) * | 2018-12-25 | 2023-05-12 | 市光工業株式会社 | Vehicle display system |
FR3097820B1 (en) * | 2019-06-25 | 2021-08-20 | Valeo Vision | METHOD OF PREVENTING A COLLISION OF A MOTOR VEHICLE WITH AN OBJECT IMPLEMENTING A LIGHT MODULE |
FR3128915A1 (en) * | 2021-11-05 | 2023-05-12 | Valeo Vision | Method for optimizing the lighting of a crossing zone between a plurality of vehicles emitting a light beam |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6733134B2 (en) * | 2000-11-09 | 2004-05-11 | Astron Group Technologies Sa | Visual signalling device adaptable to a vehicle |
US7175321B1 (en) * | 2004-03-24 | 2007-02-13 | Lopez Gustavo M | Projector systems |
US20070053195A1 (en) * | 2005-09-08 | 2007-03-08 | K.W. Muth Company, Inc. | Visual warning device |
US20080036576A1 (en) * | 2006-05-31 | 2008-02-14 | Mobileye Technologies Ltd. | Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications |
US20090013922A1 (en) * | 2007-07-13 | 2009-01-15 | Yung-Fa Lin | Automobile projected light for life safety |
US20100253541A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Traffic infrastructure indicator on head-up display |
US20100253539A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Vehicle-to-vehicle communicator on full-windshield head-up display |
US20100253594A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Peripheral salient feature enhancement on full-windshield head-up display |
US20100289632A1 (en) * | 2009-05-18 | 2010-11-18 | Gm Global Technology Operations, Inc. | Night vision on full windshield head-up display |
US20100292886A1 (en) * | 2009-05-18 | 2010-11-18 | Gm Global Technology Operations, Inc. | Turn by turn graphical navigation on full windshield head-up display |
US20120194356A1 (en) * | 2011-01-29 | 2012-08-02 | Realxperience, Llc | System that warns in advance of occupants exiting or entering a parked vehicle |
US20130154815A1 (en) * | 2011-12-14 | 2013-06-20 | Hyundai Motor Company | System and method of providing warning to pedestrian using laser beam |
US9221509B2 (en) * | 2013-01-21 | 2015-12-29 | Darfon Electronics (Suzhou) Co., Ltd. | Display apparatus and vehicle having a projector device |
US20160207443A1 (en) * | 2013-09-03 | 2016-07-21 | Jaguar Land Rover Limited | System for imaging |
US20160291149A1 (en) * | 2015-04-06 | 2016-10-06 | GM Global Technology Operations LLC | Fusion method for cross traffic application using radars and camera |
US9481287B2 (en) * | 2014-01-21 | 2016-11-01 | Harman International Industries, Inc. | Roadway projection system |
US20160321924A1 (en) * | 2015-05-01 | 2016-11-03 | Hyundai America Technical Center, Inc. | Predictive road hazard identification system |
US20160330417A1 (en) * | 2015-05-06 | 2016-11-10 | Conary Enterprise Co., Ltd. | Real-time data display device for bicycles |
US20170190334A1 (en) * | 2016-01-06 | 2017-07-06 | GM Global Technology Operations LLC | Prediction of driver intent at intersection |
US20170190336A1 (en) * | 2016-01-04 | 2017-07-06 | Delphi Technologies, Inc. | Automated Vehicle Operation Based On Gesture To Pedestrian |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4161584B2 (en) * | 2002-02-07 | 2008-10-08 | トヨタ自動車株式会社 | Safety device for moving objects |
JP5262057B2 (en) | 2006-11-17 | 2013-08-14 | 株式会社豊田中央研究所 | Irradiation device |
JP4675395B2 (en) * | 2008-05-19 | 2011-04-20 | 三菱電機株式会社 | Vehicle alarm device |
JP2010277123A (en) * | 2009-05-26 | 2010-12-09 | Mazda Motor Corp | Driving support system for vehicle |
US8823552B1 (en) * | 2013-04-04 | 2014-09-02 | GM Global Technology Operations LLC | Vehicle with apparatus for generating and displaying a predefined light pattern |
JP6485234B2 (en) * | 2015-06-09 | 2019-03-20 | 株式会社デンソー | Road surface display device |
2016
- 2016-03-07 JP JP2016043543A patent/JP6500814B2/en active Active

2017
- 2017-02-24 CN CN201710103871.XA patent/CN107161076B/en active Active
- 2017-02-28 US US15/444,988 patent/US9987979B2/en active Active
- 2017-03-02 EP EP17158938.5A patent/EP3217377B1/en active Active
CN110834583A (en) * | 2018-08-15 | 2020-02-25 | 株式会社小糸制作所 | Display system for vehicle and vehicle |
US11221625B2 (en) * | 2018-10-11 | 2022-01-11 | Pony Ai Inc. | On-site notification from autonomous vehicle for traffic safety |
US10627819B1 (en) * | 2018-10-11 | 2020-04-21 | Pony Ai Inc. | On-site notification from autonomous vehicle for traffic safety |
CN112889097A (en) * | 2018-10-17 | 2021-06-01 | 戴姆勒股份公司 | Road crossing channel visualization method |
JP2020152246A (en) * | 2019-03-20 | 2020-09-24 | 株式会社リコー | Information provision system, movable body, information provision method, and information provision program |
WO2020189636A1 (en) * | 2019-03-20 | 2020-09-24 | Ricoh Company, Ltd. | Information providing system, moving body, information providing method, and information providing program |
US11084418B2 (en) * | 2019-04-10 | 2021-08-10 | Hyundai Motor Company | Apparatus and method for outputting platooning information in vehicle |
CN112208539A (en) * | 2019-07-09 | 2021-01-12 | 奥迪股份公司 | System, vehicle, method, and medium for autonomous driving of a vehicle |
EP3974254A4 (en) * | 2019-07-15 | 2022-09-07 | Great Wall Motor Company Limited | Method for using vehicle light to project pattern, vehicle light system, and vehicle |
US11532232B2 (en) * | 2019-11-01 | 2022-12-20 | Lg Electronics Inc. | Vehicle having dangerous situation notification function and control method thereof |
US20220381415A1 (en) * | 2020-02-17 | 2022-12-01 | Koito Manufacturing Co., Ltd. | Lamp system |
Also Published As
Publication number | Publication date |
---|---|
EP3217377A1 (en) | 2017-09-13 |
CN107161076A (en) | 2017-09-15 |
JP2017159699A (en) | 2017-09-14 |
CN107161076B (en) | 2019-09-27 |
JP6500814B2 (en) | 2019-04-17 |
US9987979B2 (en) | 2018-06-05 |
EP3217377B1 (en) | 2019-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9987979B2 (en) | Vehicle lighting system | |
US9583003B2 (en) | Vehicle danger notification control apparatus | |
US9785042B2 (en) | Vehicular lighting apparatus | |
CN109204311B (en) | Automobile speed control method and device | |
US20160210855A1 (en) | Method and traffic monitoring device for detecting a wrong-way driving incident of a motor vehicle | |
CN107953882B (en) | Driving assistance apparatus and driving assistance method using front monitoring apparatus | |
JP4885793B2 (en) | Obstacle monitoring device | |
JP2009236623A (en) | Object detector, periphery monitoring apparatus, driving support system, and object detection method | |
CN106257556B (en) | Detecting and communicating lane splitting maneuvers | |
CN115071702A (en) | Vehicle control device, vehicle control method, and computer program for vehicle control | |
US11361687B2 (en) | Advertisement display device, vehicle, and advertisement display method | |
JP5936258B2 (en) | Driving assistance device | |
US10759329B2 (en) | Out-of-vehicle notification device | |
US20240054886A1 (en) | Control method for visually marking a pedestrian crossing, marking device, and system | |
JP5416193B2 (en) | Obstacle monitoring device | |
JP2015090546A (en) | Outside-vehicle environment recognition device | |
WO2015190052A1 (en) | Preceding condition determination apparatus | |
US20220309924A1 (en) | Vehicle control device, vehicle, operation method for vehicle control device, and storage medium | |
US20210094467A1 (en) | Automated driving enabled vehicle | |
JP2008225578A (en) | Vehicle monitoring device | |
WO2021094800A1 (en) | Traffic signal recognition method and traffic signal recognition device | |
EP3865815A1 (en) | Vehicle-mounted system | |
JP2017144820A (en) | Illuminating system for vehicle | |
EP3992048A1 (en) | Image processing device, imaging device, mobile body, and image processing method | |
JP7247252B2 (en) | Application program, information provision method, and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAWAMATA, SHINYA;NAGATA, SHINICHI;FUJITA, KAZUYUKI;AND OTHERS;SIGNING DATES FROM 20170110 TO 20170113;REEL/FRAME:041833/0727 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |