WO2009070069A1 - A system for classifying objects in the vicinity of a vehicle - Google Patents


Info

Publication number
WO2009070069A1
Authority
WO
WIPO (PCT)
Prior art keywords
classifier
data
camera
reflected radiation
vehicle
Prior art date
Application number
PCT/SE2007/050901
Other languages
French (fr)
Inventor
Ognjan Hedberg
Jonas HAMMARSTRÖM
Original Assignee
Autoliv Development Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autoliv Development Ab filed Critical Autoliv Development Ab
Priority to PCT/SE2007/050901 priority Critical patent/WO2009070069A1/en
Priority to EP07852173A priority patent/EP2212160A4/en
Publication of WO2009070069A1 publication Critical patent/WO2009070069A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06F18/256 Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R2021/0002 Type of accident


Abstract

A classification system for classifying objects in the vicinity of a vehicle, the system comprising: a video camera (1) for gathering camera data; a reflected radiation system (2) for gathering reflected radiation data; and a classifier (9), wherein raw data from the camera (1) and the reflected radiation system (2) are combined and analysed by the classifier (9), the classifier (9) being configured to provide an output relating to the type of an object (6) that appears in data gathered by both the camera (1) and the reflected radiation system (2).

Description

A system for classifying objects in the vicinity of a vehicle.
Description of invention
THIS INVENTION relates to a classification system, and in particular concerns a system for classifying objects in the vicinity of a vehicle.
Modern motor vehicles are typically equipped with several different safety systems, which are adapted to protect both occupants of the vehicle (in the case of internal air-bags or seat belt pretensioners) and pedestrians (for example, bonnet lifters and external air-bags).
When it is determined by vehicle crash sensors that an impact is imminent, or that a crash is occurring, one or more of these safety systems may be deployed, or a safety system may be deployed in one of a plurality of possible modes, depending in part upon the type of the other object that is involved. For instance, an impact with a pole or tree at a given speed may be more severe than an impact with another vehicle. If it appears that the vehicle is about to strike a pedestrian, then an external air-bag or bonnet lifter may be activated, but if the vehicle is about to strike an inanimate object such as a tree then there is no need for these protection systems to be deployed.
Accurate classification of objects in the vicinity of a vehicle is therefore important for vehicle safety systems to be activated in the most appropriate manner.
Existing object classifiers comprise computer programs which are operable to analyse data from a vehicle sensor, such as a camera or radar system. The classifier is "trained" with exposure to many different types of object in different circumstances, so that the program is able to make an accurate determination as to the type of a new kind of object that is detected.
It is an object of the present invention to provide an improved classification system of this type. Accordingly, one aspect of the present invention provides a classification system for classifying objects in the vicinity of a vehicle, the system comprising: a video camera for gathering camera data; a reflected radiation system for gathering reflected radiation data; and a classifier, wherein raw data from the camera and the reflected radiation system are combined and analysed by the classifier, the classifier being configured to provide an output relating to the type of an object that appears in data gathered by both the camera and the reflected radiation system.
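The combination of raw camera and reflected-radiation data described above can be sketched as a simple feature-level fusion step. The function name, array shapes and use of NumPy are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def fuse_raw_data(camera_patch, radar_patch):
    """Concatenate raw camera and radar samples into one fused vector.

    Both inputs are illustrative NumPy arrays covering the same region of
    space: a grey-scale image patch and a grid of radar returns. The fused
    vector is what a trained classifier would then analyse.
    """
    return np.concatenate([camera_patch.ravel(), radar_patch.ravel()])

# Example: an 8x8 camera patch and a 4x4 radar patch fuse into one vector
camera_patch = np.zeros((8, 8))
radar_patch = np.ones((4, 4))
fused = fuse_raw_data(camera_patch, radar_patch)
print(fused.shape)  # (80,)
```

Because both sensor modalities contribute raw samples to a single vector, the downstream classifier can exploit correlations between them rather than merging two independent decisions.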
Advantageously, the determination comprises an indication that the object is a certain type of object.
Alternatively, the determination comprises a probability that the object is of a certain type.
Preferably, the classifier comprises a neural network, a support vector machine or an adaptive boosting classifier.
Conveniently, the classifier is trained to discriminate between different types of object.
Advantageously, the classifier is trained to discriminate between pedestrians and other types of object.
Preferably, the classifier is trained to discriminate between pedestrians, vehicles and pole- like objects.
Conveniently, the output of the classifier is used to control the deployment of a pedestrian protection device.
Advantageously, the output of the classifier is used to control the deployment of both a pedestrian protection device and an occupant protection device.
Preferably, the data gathered by the camera and the reflected radiation system are input into a pre-selector, which is configured to analyse the data from the camera and the reflected radiation system and to identify regions of the data which contain an object of potential interest.
Conveniently, only the regions identified by the pre-selector are analysed by the classifier.
Advantageously, the data from the camera and/or the reflected radiation system is analysed by an impact evaluator, which evaluates the risk of an impact between the vehicle and the object of potential interest.
Preferably, outputs from both the classifier and the impact evaluator are used to control the deployment of one or more protection devices.
Conveniently, the protection devices are pedestrian protection devices and/or occupant protection devices.
Advantageously, the reflected radiation system is a radar system.
In order that the present invention may be more readily understood, embodiments thereof will now be described, by way of example, with reference to the accompanying drawings, in which:
Figures 1 and 4 show schematic views of the components of a classification system embodying the present invention; and
Figures 2 and 3 show schematic views of data gathered by different on-board vehicle sensors. Referring firstly to figure 1, components of a system embodying the present invention are shown. A vehicle camera 1 and a radar system 2 are both provided, being mounted on a vehicle so that the fields of view of the camera 1 and radar system 2 are overlapping.
The camera 1 may be a "mono" camera, may comprise a stereo camera system, may comprise an infrared (IR) camera or be any other suitable kind of camera that forms an image from received light, which need not fall within the visible portion of the spectrum.
The radar system 2 may be a conventional radar system, or may alternatively comprise a lidar system, an ultrasonic system or any other suitable system in which waves are emitted by an emitter and reflected waves which return to the vehicle are analysed, as will be understood by those of skill in the art.
Data from both the camera 1 and the radar system 2 are output to a pre-selector 3, which identifies potential objects of interest in the data.
Referring to figure 2, simplified images 4,5 from the camera system 1 and the radar system 2 are shown, in the case where a pedestrian 6 is positioned in front of the vehicle. The image 4 from the camera 1 comprises an image of the pedestrian 6, whereas the image 5 from the radar system 2 indicates that, in the same region of space, there is an object which is closer to the vehicle than other detected "background" objects.
The pre-selector 3 is operable to determine that the data from the camera 1 and radar system 2 contain an object of potential interest, and corresponding regions 7,8 of the images 4,5 taken by the camera 1 and radar system 2 are identified by the pre-selector for further analysis. These regions 7,8 of the images 4,5 are the regions containing the data relating to the object of potential interest. Returning to figure 1, data from the selected regions 7,8 of the images 4,5 is combined, and the combined camera and radar data is then passed to a trained classifier 9. As discussed above, an object classifier is trained through exposure to different types of objects in different circumstances, so that the classifier is able to provide a high degree of accuracy in classifying objects in new data that is presented. In this case, the classifier 9 is trained by repeated exposure to combined camera and radar data relating to different types of object.
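The pre-selection of regions of potential interest can be sketched as follows. The thresholding rule, array shapes and function names are illustrative assumptions (the patent does not specify how the pre-selector identifies regions); here a region is flagged wherever radar returns are markedly closer than the background:

```python
import numpy as np

def preselect_regions(camera_image, radar_ranges, near_threshold=20.0):
    """Crude stand-in for the pre-selector (3).

    Any radar cell whose range is below `near_threshold` metres marks a
    region of potential interest; the bounding box of those cells is
    returned so the corresponding camera region can be cropped too.
    The camera image is unused in this simplified sketch.
    """
    rows, cols = np.where(radar_ranges < near_threshold)
    if rows.size == 0:
        return []  # nothing of interest: no regions passed on
    box = (rows.min(), rows.max() + 1, cols.min(), cols.max() + 1)
    return [box]

radar = np.full((4, 4), 50.0)   # background returns at 50 m
radar[1:3, 2:4] = 8.0           # a nearby object at 8 m
print(preselect_regions(None, radar))  # [(1, 3, 2, 4)]
```

Only the data inside the returned boxes would then be fused and passed to the trained classifier, which is what keeps the classification step tractable.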
The classifier 9 analyses the combined camera and radar data to provide a determination as to the type of object that appears in the combined data.
The skilled person will appreciate that different types of classifier might be used. Examples are neural network classifiers, support vector machine classifiers or Adaptive Boosting ("Adaboost") classifiers.
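As a minimal sketch of the train-then-classify flow on fused data, a single logistic neuron can stand in for any of the classifier types named above (a full neural network, SVM or AdaBoost model would be trained on many labelled examples in practice; all names and the toy data here are illustrative):

```python
import numpy as np

def train_single_neuron(X, y, epochs=200, lr=0.5):
    """Tiny stand-in for training the classifier (9): one logistic neuron."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid activation
        grad = p - y                             # logistic-loss gradient
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def classify(w, b, x):
    """Return the probability that fused vector x is of the trained type."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Toy fused vectors: 'pedestrian' examples have a high first feature
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
y = np.array([1.0, 1.0, 0.0, 0.0])
w, b = train_single_neuron(X, y)
print(classify(w, b, np.array([0.95, 0.05])) > 0.5)  # True
```

The probability output matches the embodiment in which the classifier reports a likelihood rather than a hard type decision.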
Examples of features in the combined raw camera/radar data that could be analysed to classify objects in the data include grey scale values, gradients, patterns, amplitudes and (for the radar data only) phase information.
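A sketch of extracting the feature types listed above from the raw data might look as follows; the specific statistics chosen (means, gradient energy, circular mean of phase) are illustrative assumptions, not the patent's method:

```python
import numpy as np

def extract_features(camera_patch, radar_amplitude, radar_phase):
    """Illustrative feature vector built from the raw data types above.

    Grey-scale level and gradient energy come from the camera patch;
    amplitude and (radar-only) phase statistics come from the radar.
    """
    gy, gx = np.gradient(camera_patch.astype(float))
    return np.array([
        camera_patch.mean(),                         # grey-scale level
        np.abs(gx).mean() + np.abs(gy).mean(),       # gradient energy
        radar_amplitude.mean(),                      # reflected amplitude
        np.angle(np.exp(1j * radar_phase).mean()),   # mean phase (radians)
    ])

patch = np.arange(16.0).reshape(4, 4)
feats = extract_features(patch, np.ones((2, 2)), np.zeros((2, 2)))
print(feats.shape)  # (4,)
```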
The classifier 9 may analyse the data to determine whether it contains one of several potential types of object in a single step. Alternatively, as shown in figure 4, the classifier 9 may analyse the raw combined camera/radar data separately for each potential type of object, in parallel. Referring to figure 4, the raw data is input to a pedestrian classifier 10, a pole classifier 11 and a vehicle classifier 12, and each classifier analyses the data and provides a determination as to whether the data contains an object of that particular classification, or alternatively outputs a measure of probability that the data contains that type of object.
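The parallel arrangement of figure 4 can be sketched as running one classifier per object type over the same fused data. The dictionary interface and the fixed probabilities below are hypothetical stand-ins for the trained classifiers 10, 11 and 12:

```python
def run_parallel_classifiers(fused_data, classifiers):
    """Run one classifier per object type, as in figure 4.

    `classifiers` maps a type name ('pedestrian', 'pole', 'vehicle') to a
    function returning a probability that the fused data contains an
    object of that type.
    """
    return {name: clf(fused_data) for name, clf in classifiers.items()}

probs = run_parallel_classifiers(
    fused_data=[0.9],
    classifiers={
        "pedestrian": lambda d: 0.8,   # stand-in for classifier 10
        "pole": lambda d: 0.1,         # stand-in for classifier 11
        "vehicle": lambda d: 0.05,     # stand-in for classifier 12
    },
)
print(max(probs, key=probs.get))  # pedestrian
```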
In preferred embodiments, each classifier 10,11,12 is operable to output a "negative" signal if it is determined that the data does not contain an object of that particular type.
If all of the classifiers 10,11,12 output negative signals, these signals may be passed to an error generator 13, which outputs a signal indicating that the classifier 9 has not been able to identify the object. In this case, the system may default to a "safest" mode in which both occupant and pedestrian safety systems are activated if activation would be triggered by the object being of the most hazardous type.
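The fallback behaviour just described can be sketched as a simple decision rule; the threshold value and the `safest_mode` label are illustrative assumptions:

```python
def decide_classification(probs, threshold=0.5):
    """Pick the object type, or fall back to the safest mode.

    If no per-type classifier reaches `threshold` (i.e. all report
    negative), this mimics the error generator (13) triggering the mode
    in which both occupant and pedestrian systems may be armed.
    """
    positives = {k: p for k, p in probs.items() if p >= threshold}
    if not positives:
        return "safest_mode"
    return max(positives, key=positives.get)

print(decide_classification({"pedestrian": 0.2, "pole": 0.3, "vehicle": 0.1}))
# safest_mode
```

Defaulting to the most protective response when classification fails trades some unnecessary deployments for a lower risk of missing a hazardous object.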
The combined camera/radar data is also input to an impact evaluator 14, which analyses the data in parallel with the classifier 9. The impact evaluator 14 analyses the data to calculate the likelihood of the vehicle being involved in an impact with the object in question, and will also make a determination as to the likely time of the impact, and the relative speed and/or orientation of the vehicle and the object at the predicted impact. Instead of analysing the combined data, in alternative embodiments the impact evaluator 14 may consider only the camera data, or only the radar data. Indeed, the impact evaluator 14 may take data from a further sensor 17, such as an accelerometer or impact sensor, alone or in combination with the camera and/or radar data.
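The core of such an impact evaluator is a time-to-impact estimate; a constant-closing-speed version is sketched below. The patent does not give the evaluator's internals, so this is an assumed simplification:

```python
def time_to_impact(range_m, closing_speed_mps):
    """Constant-velocity time-to-impact, a stand-in for evaluator (14).

    Returns the predicted seconds until impact, or None when the object
    is not closing on the vehicle (no impact predicted).
    """
    if closing_speed_mps <= 0.0:
        return None
    return range_m / closing_speed_mps

print(time_to_impact(20.0, 10.0))  # 2.0 seconds until predicted impact
```

A real evaluator would also estimate relative speed and orientation at the predicted impact, as the text notes, typically by tracking the object over successive sensor frames.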
In preferred embodiments, only data from the camera 1 and radar system 2 in one or more regions identified by the pre-selector 3 as containing an object of potential interest are analysed by the classifier 9. The pre-selector 3 may discard other data and only pass on data relating to these regions, or the data relating to these regions may be "marked" as being for analysis.
Outputs from the classifier 9 and the impact evaluator 14 are then passed to an algorithm 15 for pedestrian protection, and also to an algorithm 16 for occupant protection. These algorithms 15,16 will coordinate the deployment, if appropriate, of one or more safety systems to protect pedestrians and/or vehicle occupants, in dependence upon the outputs from the classifier 9, which provides an indication of the type of object, and from the impact evaluator 14.
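The coordination of the two outputs can be sketched as follows; all thresholds, device names and the arming window are illustrative assumptions, not values from the patent:

```python
def deployment_decision(object_type, ttc_s, arm_window_s=1.0):
    """Combine classifier and impact-evaluator outputs, roughly as the
    pedestrian- and occupant-protection algorithms (15, 16) would.

    Returns the list of protection devices to arm: none when no impact is
    imminent, pedestrian-specific devices for a pedestrian, and occupant
    devices otherwise.
    """
    if ttc_s is None or ttc_s > arm_window_s:
        return []  # no imminent impact: deploy nothing
    if object_type == "pedestrian":
        return ["bonnet_lifter", "external_airbag", "belt_pretensioner"]
    return ["belt_pretensioner", "internal_airbag"]

print(deployment_decision("pedestrian", 0.4))
# ['bonnet_lifter', 'external_airbag', 'belt_pretensioner']
```

This reflects the motivating example from the introduction: pedestrian-specific devices deploy only when the classified object is a pedestrian, while an impact with an inanimate object arms occupant protection alone.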
The use of combined camera and radar data for analysis in a trained classifier will allow more accurate and robust classification of objects than analysis of either camera or radar data in isolation. Depending on the particular circumstances, an object may have more distinctive characteristics when camera data or radar data containing the object are considered, and combining these two types of data will ensure that the most clearly defining characteristic of a certain object can always be taken into account.
When used in this specification and claims, the terms "comprises" and "comprising" and variations thereof mean that the specified features, steps or integers are included. The terms are not to be interpreted to exclude the presence of other features, steps or components.
The features disclosed in the foregoing description, or the following claims, or the accompanying drawings, expressed in their specific forms or in terms of a means for performing the disclosed function, or a method or process for attaining the disclosed result, as appropriate, may, separately, or in any combination of such features, be utilised for realising the invention in diverse forms thereof.

Claims

CLAIMS:
1. A classification system for classifying objects in the vicinity of a vehicle, the system comprising: a video camera (1) for gathering camera data; a reflected radiation system (2) for gathering reflected radiation data; and a classifier (9), wherein raw data from the camera (1) and the reflected radiation system (2) are combined and analysed by the classifier (9), the classifier (9) being configured to provide an output relating to the type of an object (6) that appears in data gathered by both the camera (1) and the reflected radiation system (2).
2. A system according to claim 1, wherein the determination comprises an indication that the object (6) is a certain type of object.
3. A system according to claim 1, wherein the determination comprises a probability that the object (6) is of a certain type.
4. A system according to any preceding claim, wherein the classifier (9) comprises a neural network, a support vector machine or an adaptive boosting classifier.
5. A system according to any preceding claim, wherein the classifier (9) is trained to discriminate between different types of object.
6. A system according to claim 5, wherein the classifier (9) is trained to discriminate between pedestrians and other types of object.
7. A system according to claim 5 or 6, wherein the classifier (9) is trained to discriminate between pedestrians, vehicles and pole-like objects.
8. A system according to any preceding claim, wherein the output of the classifier (9) is used to control the deployment of a pedestrian protection device.
9. A system according to claim 8, wherein the output of the classifier (9) is used to control the deployment of both a pedestrian protection device and an occupant protection device.
10. A system according to any preceding claim, wherein the data gathered by the camera (1) and the reflected radiation system (2) are input into a pre-selector, which is configured to analyse the data from the camera (1) and the reflected radiation system (2) and to identify regions of the data which contain an object of potential interest.
11. A system according to claim 10, wherein only the regions identified by the pre-selector are analysed by the classifier (9).
12. A system according to any preceding claim, wherein the data from the camera (1) and/or the reflected radiation system (2) is analysed by an impact evaluator (14), which evaluates the risk of an impact between the vehicle and the object of potential interest.
13. A system according to claim 12, wherein outputs from both the classifier (9) and the impact evaluator (14) are used to control the deployment of one or more protection devices.
14. A system according to claim 13, wherein the protection devices are pedestrian protection devices and/or occupant protection devices.
15. A system according to any preceding claim, wherein the reflected radiation system is a radar system.
PCT/SE2007/050901 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle WO2009070069A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/SE2007/050901 WO2009070069A1 (en) 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle
EP07852173A EP2212160A4 (en) 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SE2007/050901 WO2009070069A1 (en) 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle

Publications (1)

Publication Number Publication Date
WO2009070069A1 (en) 2009-06-04

Family

ID=40678803

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2007/050901 WO2009070069A1 (en) 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle

Country Status (2)

Country Link
EP (1) EP2212160A4 (en)
WO (1) WO2009070069A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013033351A3 (en) * 2011-08-30 2013-06-27 5D Robotics, Inc. Graphical rendition of multi-modal data
WO2016126315A1 (en) * 2015-02-06 2016-08-11 Delphi Technologies, Inc. Autonomous guidance system
WO2018047115A1 (en) * 2016-09-08 2018-03-15 Mentor Graphics Development (Deutschland) Gmbh Object recognition and classification using multiple sensor modalities
US10150414B2 (en) 2016-07-08 2018-12-11 Ford Global Technologies, Llc Pedestrian detection when a vehicle is reversing
US10317901B2 (en) 2016-09-08 2019-06-11 Mentor Graphics Development (Deutschland) Gmbh Low-level sensor fusion
US10520904B2 (en) 2016-09-08 2019-12-31 Mentor Graphics Corporation Event classification and object tracking
US10553044B2 (en) 2018-01-31 2020-02-04 Mentor Graphics Development (Deutschland) Gmbh Self-diagnosis of faults with a secondary system in an autonomous driving system
US10678240B2 (en) 2016-09-08 2020-06-09 Mentor Graphics Corporation Sensor modification based on an annotated environmental model
EP3648006A4 (en) * 2017-09-22 2020-07-29 Samsung Electronics Co., Ltd. Method and apparatus for recognizing object
US10884409B2 (en) 2017-05-01 2021-01-05 Mentor Graphics (Deutschland) Gmbh Training of machine learning sensor data classification system
US10948924B2 (en) 2015-02-06 2021-03-16 Aptiv Technologies Limited Method and apparatus for controlling an autonomous vehicle
US10991247B2 (en) 2015-02-06 2021-04-27 Aptiv Technologies Limited Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles
US11067996B2 (en) 2016-09-08 2021-07-20 Siemens Industry Software Inc. Event-driven region of interest management
US11145146B2 (en) 2018-01-31 2021-10-12 Mentor Graphics (Deutschland) Gmbh Self-diagnosis of faults in an autonomous driving system

Citations (5)

Publication number Priority date Publication date Assignee Title
US20010045981A1 (en) * 2000-05-24 2001-11-29 Joachim Gloger Camera-based precrash detection system
US20030060956A1 (en) * 2001-09-21 2003-03-27 Ford Motor Company Method for operating a pre-crash sensing system with object classifier in a vehicle having a countermeasure system
US20030114964A1 (en) * 2001-12-19 2003-06-19 Ford Global Technologies, Inc. Simple classification scheme for vehicle/pole/pedestrian detection
US20050131646A1 (en) * 2003-12-15 2005-06-16 Camus Theodore A. Method and apparatus for object tracking prior to imminent collision detection
EP1760632A2 (en) * 2005-08-30 2007-03-07 Fuji Jukogyo Kabushiki Kaisha Image processing equipment

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US6269308B1 (en) * 1998-08-20 2001-07-31 Honda Giken Kogyo Kabushiki Kaisha Safety running system for vehicle
JP2007148835A (en) * 2005-11-28 2007-06-14 Fujitsu Ten Ltd Object distinction device, notification controller, object distinction method and object distinction program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2212160A4 *

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9731417B2 (en) 2011-08-30 2017-08-15 5D Robotics, Inc. Vehicle management system
US9195911B2 (en) 2011-08-30 2015-11-24 5D Robotics, Inc. Modular robotic manipulation
WO2013033351A3 (en) * 2011-08-30 2013-06-27 5D Robotics, Inc. Graphical rendition of multi-modal data
US10948924B2 (en) 2015-02-06 2021-03-16 Aptiv Technologies Limited Method and apparatus for controlling an autonomous vehicle
WO2016126315A1 (en) * 2015-02-06 2016-08-11 Delphi Technologies, Inc. Autonomous guidance system
US10209717B2 (en) 2015-02-06 2019-02-19 Aptiv Technologies Limited Autonomous guidance system
US11763670B2 (en) 2015-02-06 2023-09-19 Aptiv Technologies Limited Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles
US11543832B2 (en) 2015-02-06 2023-01-03 Aptiv Technologies Limited Method and apparatus for controlling an autonomous vehicle
US10991247B2 (en) 2015-02-06 2021-04-27 Aptiv Technologies Limited Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles
US10150414B2 (en) 2016-07-08 2018-12-11 Ford Global Technologies, Llc Pedestrian detection when a vehicle is reversing
RU2708469C2 (en) * 2016-07-08 2019-12-09 Ford Global Technologies, LLC Detecting pedestrians when vehicle is reversing
US10520904B2 (en) 2016-09-08 2019-12-31 Mentor Graphics Corporation Event classification and object tracking
US10558185B2 (en) 2016-09-08 2020-02-11 Mentor Graphics Corporation Map building with sensor measurements
US10678240B2 (en) 2016-09-08 2020-06-09 Mentor Graphics Corporation Sensor modification based on an annotated environmental model
US10317901B2 (en) 2016-09-08 2019-06-11 Mentor Graphics Development (Deutschland) Gmbh Low-level sensor fusion
US10740658B2 (en) 2016-09-08 2020-08-11 Mentor Graphics Corporation Object recognition and classification using multiple sensor modalities
US10802450B2 (en) 2016-09-08 2020-10-13 Mentor Graphics Corporation Sensor event detection and fusion
WO2018047115A1 (en) * 2016-09-08 2018-03-15 Mentor Graphics Development (Deutschland) Gmbh Object recognition and classification using multiple sensor modalities
US10585409B2 (en) 2016-09-08 2020-03-10 Mentor Graphics Corporation Vehicle localization with map-matched sensor measurements
US11067996B2 (en) 2016-09-08 2021-07-20 Siemens Industry Software Inc. Event-driven region of interest management
US10884409B2 (en) 2017-05-01 2021-01-05 Mentor Graphics (Deutschland) Gmbh Training of machine learning sensor data classification system
US11170201B2 (en) 2017-09-22 2021-11-09 Samsung Electronics Co., Ltd. Method and apparatus for recognizing object
EP3648006A4 (en) * 2017-09-22 2020-07-29 Samsung Electronics Co., Ltd. Method and apparatus for recognizing object
US10553044B2 (en) 2018-01-31 2020-02-04 Mentor Graphics Development (Deutschland) Gmbh Self-diagnosis of faults with a secondary system in an autonomous driving system
US11145146B2 (en) 2018-01-31 2021-10-12 Mentor Graphics (Deutschland) Gmbh Self-diagnosis of faults in an autonomous driving system

Also Published As

Publication number Publication date
EP2212160A4 (en) 2012-07-04
EP2212160A1 (en) 2010-08-04

Similar Documents

Publication Publication Date Title
EP2212160A1 (en) A system for classifying objects in the vicinity of a vehicle
US8876157B2 (en) System for protection of a vulnerable road user and method for operating the system
US10007854B2 (en) Computer vision based driver assistance devices, systems, methods and associated computer executable code
US8379924B2 (en) Real time environment model generation system
JP4598653B2 (en) Collision prediction device
US7486802B2 (en) Adaptive template object classification system with a template generator
US7480570B2 (en) Feature target selection for countermeasure performance within a vehicle
US9158978B2 (en) Vehicle environment classifying safety system for a motor vehicle
US7616101B2 (en) Device for monitoring the surroundings of a vehicle
EP2484567B1 (en) An onboard perception system
JP5178276B2 (en) Image recognition device
CN104709215B (en) The method of security system and the security system for operating vehicle
WO2016185653A1 (en) Protection control apparatus
EP2562053B1 (en) Method, computer program product and system for determining whether it is necessary to utilize a vehicle's safety equipment and vehicle comprising these
US7636625B2 (en) Device for classifying at least one object with the aid of an environmental sensor system
KR20120117753A (en) Method and controller for recognizing a width of an impact area of an object in the front area of a vehicle
WO2019088028A1 (en) Protection control device and control method of protection control device
US20050125126A1 (en) Pre-crash sensing system and method for detecting and classifying objects
WO2014171863A1 (en) System for controlling the deployment of an external safety device
EP1274608A1 (en) Method for deploying a safety device in a crash
US20050004719A1 (en) Device and method for determining the position of objects in the surroundings of a motor vehicle
US10908259B2 (en) Method for detecting a screening of a sensor device of a motor vehicle by an object, computing device, driver-assistance system and motor vehicle
EP2851840B1 (en) Vision system and vision method for a motor vehicle
US20220009439A1 (en) Enhanced occupant collision safety system
KR20220152590A (en) Apparatus for preventing accidents caught in vehicle door and method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07852173

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2007852173

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE