EP1030188A1 - Situation awareness system - Google Patents

Situation awareness system

Info

Publication number
EP1030188A1
Authority
EP
European Patent Office
Prior art keywords
data objects
attributed data
events
images
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP00102979A
Other languages
German (de)
French (fr)
Other versions
EP1030188B1 (en)
Inventor
Richard C. Waters
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp
Publication of EP1030188A1
Application granted
Publication of EP1030188B1
Anticipated expiration
Current legal status: Expired - Lifetime

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/269 Analysis of motion using gradient-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/97 Determining parameters from multiple pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54 Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Abstract

A situation awareness system includes a plurality of cameras. Each camera acquires a sequence of images of a particular part of an area of interest. There is overlap between the parts so that the system can obtain depth information about objects in the area of interest. An analyzer identifies moving objects in the areas of interest, attributes of the moving objects, and events related to the moving objects. A display device displays the attributed objects and events as annotated graphic elements and alerts.

Description

    FIELD OF THE INVENTION
  • The invention relates generally to a monitoring system rendering a synthetic display derived from multiple cameras mounted at various locations, and more particularly, to alerting a viewer to special situations observed.
  • BACKGROUND OF THE INVENTION
  • Video monitoring systems are well known. In the case of vehicles, several types of monitoring systems are in use. Some vehicles, e.g., buses, have cameras mounted so that the driver can view road areas beside or behind the bus. However, there is typically only one camera, and the display merely shows exactly what the camera sees. There is no attempt to analyze the displayed image. These systems simply act as viewing mirrors for hard-to-see areas. Similarly, law enforcement vehicles may capture a historical record of the view from the front window.
  • Some vehicles, such as computer controlled cars, also include sensors. The sensors detect potentially dangerous situations, such as closing up too rapidly on another vehicle. A variety of sensors have been used, for example, sonar, lasers, and microwaves. These systems do not provide a general situation display; rather, they merely detect a few dangerous situations.
  • Radar and sonar systems can produce a situation display, and sometimes perform some analysis, for example, in an air traffic control system. However, radar and sonar systems are not based on video images, but rather on the processing of reflected signals transmitted at specific frequencies.
  • Several types of surveillance systems are known. Typically, the systems route multiple video streams to a central location. The video streams can be displayed on corresponding monitors. If the number of cameras is greater than the number of display stations, then the system usually displays camera views in sequence, or on operator demand. These types of systems do not perform analysis, nor do they merge multiple streams into a single situation display. At most they may tile multiple independent views on a single screen with time and location annotations.
  • There are also systems that monitor specific places, such as escalators, elevators, toll gates, bank machines, and perimeter fences, in order to determine the occurrence of particular situations. Some of these systems may attempt to analyze the video in order to detect moving objects, for example, to extract a license number. However, these systems typically do not combine information from multiple sources, nor do they generate an overall situation display, nor do they synthesize an image from a different point of view.
  • SUMMARY OF THE INVENTION
  • The invention provides a situation awareness system which includes a plurality of cameras. Each camera acquires a sequence of images of overlapping parts of an area of interest. An analyzer merges the sequences of images acquired by the plurality of cameras, and identifies moving objects in the area of interest. A display device displays the merged sequences of images, and information associated with the identified moving objects.
  • In one aspect of the invention, the optical flow in temporally successive images of a single video stream is analyzed to generate motion fields. Spatially adjacent images of multiple video streams are registered to obtain depth images. The motion fields and depth images are segmented to generate partially attributed data objects. Using an application specific database and analysis, the partially attributed data objects are converted to fully attributed data objects and events, which are displayed as annotated graphic elements and alerts. As one feature of the invention, the viewing orientation of the display is independent of the point of view of the cameras.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Figure 1 is a block diagram of an awareness system according to the invention;
  • Figure 2 is a block diagram of an analyzer synthesizer of the system of Figure 1; and
  • Figure 3 is an example synthetic image generated by the system of Figure 1.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • System Overview
  • Figure 1 shows the situation awareness system 100 according to my invention. The system 100 includes multiple video cameras 101-106. Each camera acquires a sequence of images as a video stream 115. Six cameras are shown; fewer or more can be used. Additional cameras can be provided for redundancy in the case of camera failure or obstruction. The cameras can be arranged to obtain a full 360 degree field of view of an area of interest around a vehicle.
  • For other applications, a smaller field of view is suitable. The images provided by each camera overlap parts of the area of interest such that stereoscopic techniques can be used to extract depth information. Wide angle lenses can be used to increase the amount of overlap without increasing the number of cameras.
  • The output of the cameras, digitized video streams 115, is connected to an analyzer-synthesizer 200. The analyzer-synthesizer 200, according to my invention, analyzes the video streams and generates a synthetic display 300 on an output device 120.
  • System Operation
  • In an operational system, the cameras can be mounted on, for example, a vehicle 130 shown by dashed lines in Figure 1. The cameras can also be placed at other fixed or moving locations to observe the area of interest, generally 125, the areas in front of the various lenses.
  • The analyzer-synthesizer 200 operates on the data of the multiple video streams in real-time. The analyzer portion extracts temporal and spatial data from the video streams to identify objects and their attributes, such as size, position, and velocity. In addition, relationships between the identified objects are determined, for example, two vehicles on intersecting courses. In other words, the video streams are reduced to a relationship of attributed objects. The attributed objects are analyzed to detect events, for example, a possible collision, or a danger zone. The synthesizer portion generates the situation awareness display 300 of the relationships of the attributed objects, and optional alerts related to the events.
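The patent does not prescribe a data structure for these attributed objects, but the pipeline is easier to follow with one in mind. The following is a minimal sketch in Python; all names (AttributedObject, Event, and the field layout) are hypothetical illustrations, not terms from the patent.

```python
# Hypothetical sketch of the analyzer's intermediate representation.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class AttributedObject:
    """One object extracted from the video streams, with its attributes."""
    object_id: int
    position: Tuple[float, float, float]   # (x, y, z) metres, vehicle-centred
    velocity: Tuple[float, float, float]   # (vx, vy, vz) metres/second
    size: Tuple[float, float, float]       # bounding box (width, length, height)
    label: str = "unknown"                 # e.g. "car", "pedestrian"; set later

@dataclass
class Event:
    """A situation derived from relationships between attributed objects."""
    kind: str                              # e.g. "possible_collision"
    object_ids: List[int] = field(default_factory=list)
    severity: float = 0.0                  # drives whether an alert is raised
```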
  • According to my invention, the situation awareness display 300 is entirely synthetic. In contrast with the prior art, I discard the video stream 115 after it is analyzed. In addition, the synthesizer integrates information extracted from the multiple video streams into a single display 300. Furthermore, alert signals 140 may be generated when certain dangerous situations or events are recognized. The alert signals can be displayed, or presented to some other output device 150. In an alternative embodiment, the alert signals 140 can initiate evasive collision avoidance action, for example, braking or slowing down.
  • Analyzer-Synthesizer
  • As shown in Figure 2, video streams 115 from multiple cameras 101-106 are presented to the analyzer/synthesizer 200, via an A/D converter if necessary, as digital video data 201. Temporal and spatial information is extracted from the digital video data 201.
  • Optical flow analysis 210 is used to determine motion fields 211 from images separated in time (Δt), for example, from motion fields of successive frames in a single video sequence.
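The patent does not name a particular flow algorithm. As one plausible concrete choice, a dense method such as OpenCV's Farneback estimator yields the per-pixel motion field between successive frames; the sketch below assumes that choice, and the function name and parameter values are illustrative only.

```python
# Hedged sketch: dense optical flow between two successive frames of one
# video stream, using OpenCV's Farneback method as a stand-in for the
# unspecified flow analysis 210.
import cv2
import numpy as np

def motion_field(prev_frame: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Return an (H, W, 2) array of per-pixel (dx, dy) displacements."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
```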
  • Image registration 220 is used to determine a depth image 221 from images overlapping in space (Δx, Δy), for example, using frames taken of overlapping parts of the area of interest by multiple cameras. The depth image specifies the distance (Δz) to each pixel in the image.
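The registration method is likewise left open; one conventional reading is rectified two-camera stereo, where disparity d maps to depth z = f·B/d for focal length f (pixels) and baseline B (metres). A hedged sketch with invented calibration values:

```python
# Hedged sketch: depth from a rectified stereo pair using semi-global
# block matching; the focal length and baseline are illustrative values,
# not figures from the patent.
import cv2
import numpy as np

def depth_image(left_gray: np.ndarray, right_gray: np.ndarray,
                focal_px: float = 700.0, baseline_m: float = 0.3) -> np.ndarray:
    """Estimate per-pixel depth in metres from overlapping camera views."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64,
                                    blockSize=9)
    # compute() returns disparity in fixed point, scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # no match: depth unknown
    return focal_px * baseline_m / disparity    # z = f * B / d
```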
  • The motion fields and depth image are segmented to produce partially attributed data objects 231. For example, pixels having the same optical flow at the same depth are likely to be related to the same object. Using both the optical flow and distances provides for a robust segmentation, particularly when the flow analysis is done concurrently with the registration so the derived results (motion fields and depth values) correlate with each other (215).
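One way to realize the "same flow at the same depth" rule is to cluster per-pixel features built from image position, flow, and depth. The sketch below uses scikit-learn's DBSCAN on a subsampled grid; the feature weights and clustering parameters are assumptions for illustration.

```python
# Hedged sketch of segmentation 230: group pixels whose position, optical
# flow, and depth agree into candidate (partially attributed) objects.
import numpy as np
from sklearn.cluster import DBSCAN

def segment(flow: np.ndarray, depth: np.ndarray, step: int = 8):
    """Label subsampled pixels; label -1 marks background or noise."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h:step, 0:w:step]
    feats = np.column_stack([
        xs.ravel(), ys.ravel(),                    # image position
        flow[ys, xs].reshape(-1, 2) * 10.0,        # weighted motion
        depth[ys, xs].ravel()[:, None] * 5.0,      # weighted depth
    ])
    labels = np.full(feats.shape[0], -1)
    valid = ~np.isnan(feats).any(axis=1)           # skip pixels of unknown depth
    labels[valid] = DBSCAN(eps=20.0, min_samples=5).fit_predict(feats[valid])
    return xs.ravel(), ys.ravel(), labels
```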
  • The partial attributes can include the size, position, velocity, and direction of movement of the objects in three-dimensional space. The objects are only partially attributed because other attributes that depend on additional knowledge, such as the exact identity of the objects, have not yet been determined.
  • The partially attributed data objects 231, in conjunction with an application specific database 239, can be analyzed 240 to generate fully attributed data objects 241 and events 242. For example, a one-sided view of a semi-trailer is sufficient to deduce the entire shape of the object. Various kinds of template matching schemes can be used to fully identify specific commonly occurring objects, such as other vehicles, pedestrians, bicycles, trucks, and the like. In a vehicle application, the features may also include lane dividers, sidewalks, stop signs, guard rails, curbs, buildings, fences, and so forth.
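The patent names template matching without fixing a scheme. A minimal normalized cross-correlation sketch, assuming a small labelled template library (the dictionary, threshold, and function name are hypothetical):

```python
# Hedged sketch: classify an object patch by normalized cross-correlation
# against an application-specific library of labelled templates.
import cv2
import numpy as np

def classify_patch(patch_gray: np.ndarray,
                   templates: dict,
                   threshold: float = 0.7) -> str:
    """Return the best-matching label, or "unknown" below the threshold."""
    best_label, best_score = "unknown", threshold
    for label, templ in templates.items():
        th, tw = templ.shape
        if patch_gray.shape[0] < th or patch_gray.shape[1] < tw:
            continue                      # template larger than the patch
        score = cv2.matchTemplate(patch_gray, templ,
                                  cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```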
  • The events 242 can be generated by analyzing the relationships among the attributed objects, for example, a potential collision situation, a car drifting off the road, or a fading light situation. Additional sensors 249 can also be used to enlarge the set of events that can be detected.
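A potential-collision event can be phrased as a time-to-collision test on pairs of attributed objects. This sketch reuses the hypothetical AttributedObject fields from above; the 3-second horizon is an assumed parameter, not a figure from the patent.

```python
# Hedged sketch of one event rule: flag a pair of objects whose closing
# speed implies collision within a short horizon.
import numpy as np

def collision_event(a, b, horizon_s: float = 3.0):
    """Return an event dict if a and b may collide within horizon_s seconds."""
    rel_pos = np.asarray(b.position) - np.asarray(a.position)
    rel_vel = np.asarray(b.velocity) - np.asarray(a.velocity)
    dist = np.linalg.norm(rel_pos)
    closing = -float(rel_pos @ rel_vel) / (dist + 1e-9)  # >0 when approaching
    if closing <= 0:
        return None                       # separation is growing, no threat
    ttc = dist / closing                  # time to collision, seconds
    if ttc >= horizon_s:
        return None
    return {"kind": "possible_collision",
            "object_ids": [a.object_id, b.object_id],
            "severity": 1.0 - ttc / horizon_s}
```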
  • A synthesizer 250 converts the fully attributed data objects 241 to annotated graphic elements 251 and alerts 252. The last step renders 260 the graphic elements 251 and alerts 252.
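Rendering step 260 is likewise unspecified; for illustration, a top-down plot in which each attributed object becomes a labelled rectangle with a velocity arrow conveys the idea. Matplotlib stands in for the output device 120 here, and the severity-based alert handling assumes the hypothetical event dicts sketched above.

```python
# Hedged sketch of rendering 260: draw attributed objects as annotated
# graphic elements in a synthetic bird's-eye view.
import matplotlib.pyplot as plt
import matplotlib.patches as mpatches

def render(objects, alerts, extent_m: float = 50.0):
    fig, ax = plt.subplots()
    ax.set_xlim(-extent_m, extent_m)
    ax.set_ylim(-extent_m, extent_m)
    ax.set_aspect("equal")
    for obj in objects:
        x, y = obj.position[0], obj.position[1]
        w, l = obj.size[0], obj.size[1]
        ax.add_patch(mpatches.Rectangle((x - w / 2, y - l / 2), w, l,
                                        fill=False))
        ax.annotate(obj.label, (x, y))                 # object annotation
        ax.arrow(x, y, obj.velocity[0], obj.velocity[1],
                 head_width=0.8)                       # direction of movement
    if alerts:                                         # surface the worst alert
        worst = max(alerts, key=lambda e: e["severity"])
        ax.set_title(f"ALERT: {worst['kind']}")
    plt.show()
```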
  • Display
  • Many different types of situation displays are possible. The display 300 in Figure 3 shows a bird's eye view of the area of interest, with the vehicle 310 on which the situation awareness device is mounted located at a fixed orientation near the center of the display, and annotated objects moving relative to the point of view. Note that the view is totally synthetic and orthogonal to the view seen by the cameras.
  • Certain other image features are shown as well, such as a pedestrian lane crossing 320, buildings 330, other traffic 340, a bicycle 350, and so forth.
  • Arrows 301 can be used to show the direction of movement of objects that are not stationary. Determining the orientation of the arrows requires an active analysis, as opposed to passively displaying the output of the cameras as done in the prior art.
    In an area of interest where sufficient ambient light cannot be assured, my invention can be extended by including active illumination. In some situations it could benefit from using infrared light, either to see in the dark without requiring active illumination, or as inoffensive active illumination. In situations such as fog, where visibility is poor, my invention can operate using carefully selected wavelengths or strobed light sources appropriately synchronized with the camera shutters, so as to focus on objects of interest and reject other scattered light.
  • In one embodiment of my invention, the analyzing step 240 can receive secondary data 238. In a vehicle application, the data can include vehicle velocity, or position as obtained from a GPS receiver. With the vehicle's velocity, the analysis can be improved and simplified. Positional data enables the use of maps on the display, and actual street and place names.
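As a small illustration of why the ego velocity helps: subtracting it converts camera-relative object motion into ground-relative motion, so stationary scene features can be told apart from genuinely moving objects. The function names and the 0.5 m/s tolerance below are assumptions.

```python
# Hedged sketch: use secondary data 238 (ego velocity from GPS/odometry)
# to recover ground-relative object motion.
import numpy as np

def ground_velocity(obj_rel_vel, ego_vel) -> np.ndarray:
    """Ground velocity = camera-relative velocity plus ego velocity."""
    return np.asarray(obj_rel_vel) + np.asarray(ego_vel)

def is_stationary(obj_rel_vel, ego_vel, tol_mps: float = 0.5) -> bool:
    """True if the object is effectively fixed to the ground (sign, building)."""
    return bool(np.linalg.norm(ground_velocity(obj_rel_vel, ego_vel)) < tol_mps)
```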
  • In another embodiment, the display 300 is under user control. For instance, in a building surveillance application, the user supplies control signals 239 to alter the way that the annotated graphic elements and alerts are displayed, or to change the orientation of the point of view. It is also possible to transmit the alerts and graphic elements to a remote location. For instance, while walking toward a parked vehicle, the operator can view, on a portable display device, the area of interest in the vicinity of the vehicle from a safe location.
  • In addition, multiple vehicles can exchange situation information with each other to enhance the scope of the display. Other areas where the invention can be used include airports, waterways, and the like.
  • This invention is described using specific terms and examples. It is to be understood that various other adaptations and modifications may be made within the spirit and scope of the invention. Therefore, it is the object of the appended claims to cover all such variations and modifications as come within the true spirit and scope of the invention.

Claims (11)

  1. A real-time situation awareness system, comprising:
    a plurality of cameras acquiring a plurality of video streams of overlapping parts of an area of interest;
    analyzer means for reducing the plurality of video streams to attributed data objects and events; and
    synthesizer means for rendering the attributed data objects and events as annotated graphic elements and alerts on an output device.
  2. The system of claim 1 further comprising:
    means for temporally analyzing an optical flow in successive images of a single video stream to generate motion fields;
    means for spatially registering adjacent images of multiple video streams to obtain depth images; and
    means for segmenting the motion fields and depth images to generate partially attributed data objects.
  3. The system of claim 2 further comprising:
    means for analyzing the partially attributed data objects using an application specific database to generate fully attributed data objects and events.
  4. The system of claim 3 further comprising:
    sensors providing the analyzing step with secondary data and signals.
  5. The system of claim 1 wherein the synthesizer means produces a display having a point of view substantially orthogonal to the point of view of the cameras.
  6. The system of claim 1 wherein the area of interest is a panoramic scene.
  7. The system of claim 1 wherein annotations for the graphic elements include directions of movement.
  8. The system of claim 5 wherein user control signals determine the display.
  9. A method for generating a real-time situation awareness display, comprising the steps of:
    acquiring a plurality of video streams of overlapping parts of an area of interest;
    reducing the plurality of video streams to attributed data objects and events; and
    rendering the attributed data objects and events as annotated graphic elements and alerts on an output device.
  10. The method of claim 9 further comprising:
    temporally analyzing an optical flow in successive images of a single video stream to generate motion fields;
    spatially registering adjacent images of multiple video streams to obtain depth images; and
    segmenting the motion fields and depth images to generate partially attributed data objects.
  11. The method of claim 10 further comprising:
    analyzing the partially attributed data objects using an application specific database to generate fully attributed data objects and events.
EP00102979A 1999-02-16 2000-02-14 Situation awareness system Expired - Lifetime EP1030188B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US250956 1981-04-01
US09/250,956 US6396535B1 (en) 1999-02-16 1999-02-16 Situation awareness system

Publications (2)

Publication Number Publication Date
EP1030188A1 2000-08-23
EP1030188B1 EP1030188B1 (en) 2005-06-01

Family

ID=22949871

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00102979A Expired - Lifetime EP1030188B1 (en) 1999-02-16 2000-02-14 Situation awareness system

Country Status (5)

Country Link
US (1) US6396535B1 (en)
EP (1) EP1030188B1 (en)
JP (1) JP3876288B2 (en)
AT (1) ATE297022T1 (en)
DE (1) DE60020420T2 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003100710A1 (en) * 2002-05-22 2003-12-04 A4Vision Methods and systems for detecting and recognizing objects in a controlled wide area
EP1168248A3 (en) * 2000-06-30 2003-12-10 Matsushita Electric Industrial Co., Ltd. Rendering device
WO2005013235A1 (en) * 2003-07-25 2005-02-10 Robert Bosch Gmbh Device for classifying at least one object in the vicinity of a vehicle
EP1530185A2 (en) * 2003-10-10 2005-05-11 Robert Bosch Gmbh Information apparatus
EP1639519A1 (en) * 2003-06-13 2006-03-29 Sarnoff Corporation Vehicular vision system
EP1639516A2 (en) * 2003-07-02 2006-03-29 Sarnoff Corporation Method and apparatus for ground detection and removal in vision systems
EP1639521A2 (en) * 2003-07-02 2006-03-29 Sarnoff Corporation Stereo-vision based imminent collision detection
EP1641653A2 (en) * 2003-07-02 2006-04-05 Sarnoff Corporation Method and apparatus for pedestrian detection
EP1679529A2 (en) * 2005-01-04 2006-07-12 Robert Bosch Gmbh Object detection method
EP1709568A2 (en) * 2003-12-15 2006-10-11 Sarnoff Corporation Method and apparatus for object tracking prior to imminent collision detection
EP1721287A1 (en) * 2004-03-02 2006-11-15 Sarnoff Corporation Method and apparatus for detecting a presence
US7174033B2 (en) 2002-05-22 2007-02-06 A4Vision Methods and systems for detecting and recognizing an object based on 3D image data
EP1760489A1 (en) * 2005-08-31 2007-03-07 CLARION Co., Ltd. Obstacle detector for vehicle
US7257236B2 (en) 2002-05-22 2007-08-14 A4Vision Methods and systems for detecting and recognizing objects in a controlled wide area
EP2008223A2 (en) * 2006-03-29 2008-12-31 Mark Dronge Security alarm system
WO2009044257A2 (en) * 2007-10-03 2009-04-09 Latecoere Method and system for aircraft taxiing assistance
WO2010094401A1 (en) * 2009-02-17 2010-08-26 Autoliv Development Ab A method and system of automatically detecting objects in front of a motor vehicle
WO2010115580A1 (en) * 2009-04-06 2010-10-14 Daimler Ag Method and apparatus for recognizing objects
DE102010013093A1 (en) * 2010-03-29 2011-09-29 Volkswagen Ag Method for creating model of surrounding area of motor vehicle i.e. car, involves determining whether card cells are loaded with object represented by three- dimensional structures
CN102348100A (en) * 2010-07-30 2012-02-08 江彦宏 Video radar display system
EP2420982A1 (en) * 2010-08-19 2012-02-22 Yan-Hong Chiang Video radar display system
EP2444947A1 (en) * 2010-10-20 2012-04-25 Yan-Hong Chiang Assistant driving system with video radar
EP1964718A3 (en) * 2007-02-27 2012-07-04 Hitachi, Ltd. Image processing apparatus, image processing method and image processing system
US8301344B2 (en) 2003-07-25 2012-10-30 Robert Bosch Gmbh Device for classifying at least one object in the surrounding field of a vehicle
CN103247168A (en) * 2012-02-14 2013-08-14 江彦宏 Remote traffic management system using video radar
EP2843589A1 (en) * 2013-08-29 2015-03-04 Alcatel Lucent A method and platform for sending a message to a communication device associated with a moving object
WO2015082105A1 (en) 2013-12-05 2015-06-11 Robert Bosch Gmbh Method and device for generating an alert by means of two images of a vehicle environment obtained via cameras
US9846812B2 (en) 2014-10-10 2017-12-19 Application Solutions (Electronics and Vision) Ltd. Image recognition system for a vehicle and corresponding method

Families Citing this family (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2259220A3 (en) * 1998-07-31 2012-09-26 Panasonic Corporation Method and apparatus for displaying image
EP1038734B1 (en) * 1998-10-08 2019-05-15 Panasonic Intellectual Property Corporation of America Driving assisting device and recording medium
WO2000064175A1 (en) * 1999-04-16 2000-10-26 Matsushita Electric Industrial Co., Ltd. Image processing device and monitoring system
GB2352859A (en) * 1999-07-31 2001-02-07 Ibm Automatic zone monitoring using two or more cameras
JP2001315603A (en) * 2000-05-09 2001-11-13 Matsushita Electric Ind Co Ltd Drive supporting device
JP3599639B2 (en) * 2000-05-26 2004-12-08 松下電器産業株式会社 Image processing device
US7319479B1 (en) * 2000-09-22 2008-01-15 Brickstream Corporation System and method for multi-camera linking and analysis
US8711217B2 (en) * 2000-10-24 2014-04-29 Objectvideo, Inc. Video surveillance system employing video primitives
US9892606B2 (en) 2001-11-15 2018-02-13 Avigilon Fortress Corporation Video surveillance system employing video primitives
US8564661B2 (en) 2000-10-24 2013-10-22 Objectvideo, Inc. Video analytic rule detection system and method
US7424175B2 (en) 2001-03-23 2008-09-09 Objectvideo, Inc. Video segmentation using statistical pixel modeling
US7183944B2 (en) * 2001-06-12 2007-02-27 Koninklijke Philips Electronics N.V. Vehicle tracking and identification of emergency/law enforcement vehicles
JP3996428B2 (en) * 2001-12-25 2007-10-24 松下電器産業株式会社 Abnormality detection device and abnormality detection system
US20050128304A1 (en) * 2002-02-06 2005-06-16 Manasseh Frederick M. System and method for traveler interactions management
US20030202701A1 (en) * 2002-03-29 2003-10-30 Jonathon Schuler Method and apparatus for tie-point registration of disparate imaging sensors by matching optical flow
US7073158B2 (en) * 2002-05-17 2006-07-04 Pixel Velocity, Inc. Automated system for designing and developing field programmable gate arrays
US6990406B2 (en) * 2002-07-22 2006-01-24 California Institute Of Technology Multi-agent autonomous system
JP2005537608A (en) * 2002-09-02 2005-12-08 サムスン エレクトロニクス カンパニー リミテッド Optical information storage medium, method and apparatus for recording and / or reproducing information on and / or from optical information storage medium
US20040052501A1 (en) * 2002-09-12 2004-03-18 Tam Eddy C. Video event capturing system and method
CA2505831C (en) * 2002-11-12 2014-06-10 Intellivid Corporation Method and system for tracking and behavioral monitoring of multiple objects moving through multiple fields-of-view
US7221775B2 (en) * 2002-11-12 2007-05-22 Intellivid Corporation Method and apparatus for computerized image background analysis
DE10325762A1 (en) * 2003-06-05 2004-12-23 Daimlerchrysler Ag Image processing system for a vehicle
US20050031169A1 (en) * 2003-08-09 2005-02-10 Alan Shulman Birds eye view virtual imaging for real time composited wide field of view
US7286157B2 (en) * 2003-09-11 2007-10-23 Intellivid Corporation Computerized method and apparatus for determining field-of-view relationships among multiple image sensors
US7280673B2 (en) * 2003-10-10 2007-10-09 Intellivid Corporation System and method for searching for changes in surveillance video
US7346187B2 (en) * 2003-10-10 2008-03-18 Intellivid Corporation Method of counting objects in a monitored environment and apparatus for the same
JP2005124010A (en) * 2003-10-20 2005-05-12 Nissan Motor Co Ltd Imaging apparatus
US7171024B2 (en) * 2003-12-01 2007-01-30 Brickstream Corporation Systems and methods for determining if objects are in a queue
US7109889B2 (en) * 2004-03-01 2006-09-19 Honeywell International Inc. Methods and apparatus for surface movement situation awareness
US7672514B2 (en) * 2004-03-02 2010-03-02 Sarnoff Corporation Method and apparatus for differentiating pedestrians, vehicles, and other objects
WO2005086078A1 (en) * 2004-03-02 2005-09-15 Sarnoff Corporation Method and apparatus for classifying an object
US8694475B2 (en) * 2004-04-03 2014-04-08 Altusys Corp. Method and apparatus for situation-based management
US7788109B2 (en) * 2004-04-03 2010-08-31 Altusys Corp. Method and apparatus for context-sensitive event correlation with external control in situation-based management
US20050222895A1 (en) * 2004-04-03 2005-10-06 Altusys Corp Method and Apparatus for Creating and Using Situation Transition Graphs in Situation-Based Management
US20050222810A1 (en) * 2004-04-03 2005-10-06 Altusys Corp Method and Apparatus for Coordination of a Situation Manager and Event Correlation in Situation-Based Management
EP1641268A4 (en) * 2004-06-15 2006-07-05 Matsushita Electric Ind Co Ltd Monitor and vehicle periphery monitor
EP1696669B1 (en) * 2005-02-24 2013-07-03 Aisin Seiki Kabushiki Kaisha Vehicle surrounding monitoring device
US8174572B2 (en) * 2005-03-25 2012-05-08 Sensormatic Electronics, LLC Intelligent camera selection and object tracking
US9036028B2 (en) 2005-09-02 2015-05-19 Sensormatic Electronics, LLC Object tracking and alerts
FR2891934B1 (en) * 2005-10-12 2008-01-18 Valeo Electronique Sys Liaison DEVICE FOR PROCESSING VIDEO DATA FOR A MOTOR VEHICLE
JP4426535B2 (en) * 2006-01-17 2010-03-03 本田技研工業株式会社 Vehicle periphery monitoring device
US7576639B2 (en) * 2006-03-14 2009-08-18 Mobileye Technologies, Ltd. Systems and methods for detecting pedestrians in the vicinity of a powered industrial vehicle
JP2009533778A (en) * 2006-04-17 2009-09-17 オブジェクトビデオ インコーポレイテッド Video segmentation using statistical pixel modeling
US7671728B2 (en) 2006-06-02 2010-03-02 Sensormatic Electronics, LLC Systems and methods for distributed monitoring of remote sites
US7825792B2 (en) * 2006-06-02 2010-11-02 Sensormatic Electronics Llc Systems and methods for distributed monitoring of remote sites
US20080036864A1 (en) * 2006-08-09 2008-02-14 Mccubbrey David System and method for capturing and transmitting image data streams
US20080151049A1 (en) * 2006-12-14 2008-06-26 Mccubbrey David L Gaming surveillance system and method of extracting metadata from multiple synchronized cameras
US8587661B2 (en) * 2007-02-21 2013-11-19 Pixel Velocity, Inc. Scalable system for wide area surveillance
JP5121258B2 (en) * 2007-03-06 2013-01-16 株式会社東芝 Suspicious behavior detection system and method
JP2010533319A (en) * 2007-06-09 2010-10-21 センサーマティック・エレクトロニクス・コーポレーション Systems and methods for integrating video analysis and data analysis / mining
US20090086023A1 (en) * 2007-07-18 2009-04-02 Mccubbrey David L Sensor system including a configuration of the sensor as a virtual sensor device
JP4970195B2 (en) * 2007-08-23 2012-07-04 株式会社日立国際電気 Person tracking system, person tracking apparatus, and person tracking program
US20090091436A1 (en) * 2007-10-05 2009-04-09 Anderson Leroy E Electronic towing ball viewer
JP4561863B2 (en) * 2008-04-07 2010-10-13 トヨタ自動車株式会社 Mobile body path estimation device
JP4553071B1 (en) * 2009-03-31 2010-09-29 コニカミノルタホールディングス株式会社 3D information display device and 3D information display method
WO2010113239A1 (en) * 2009-03-31 2010-10-07 コニカミノルタホールディングス株式会社 Image integration unit and image integration method
US9536348B2 (en) * 2009-06-18 2017-01-03 Honeywell International Inc. System and method for displaying video surveillance fields of view limitations
US20100321500A1 (en) * 2009-06-18 2010-12-23 Honeywell International Inc. System and method for addressing video surveillance fields of view limitations
PL2306426T3 (en) * 2009-10-01 2013-05-31 Kapsch Trafficcom Ag Device for detecting vehicles on a traffic surface
US20110115909A1 (en) * 2009-11-13 2011-05-19 Sternberg Stanley R Method for tracking an object through an environment across multiple cameras
WO2011149558A2 (en) 2010-05-28 2011-12-01 Abelow Daniel H Reality alternate
US9001211B1 (en) * 2010-06-11 2015-04-07 Kervin R. Spivey Surveillance system apparatus
JP4609603B2 (en) * 2010-07-21 2011-01-12 コニカミノルタホールディングス株式会社 3D information display device and 3D information display method
DE102010046433B4 (en) * 2010-09-24 2012-06-21 Grenzebach Maschinenbau Gmbh Apparatus and method for detecting defects in continuously generated float glass
DE102011014699B4 (en) * 2011-03-22 2015-10-29 Audi Ag Method for operating a driver assistance system for protecting a motor vehicle against damage and motor vehicle
CN103105581A (en) * 2013-01-24 2013-05-15 上海毕励电子科技有限公司 Electrical equipment quantity of state collection method based on video recognition
US9696420B2 (en) * 2013-04-09 2017-07-04 Ford Global Technologies, Llc Active park assist object detection
JP6062122B2 (en) 2014-08-21 2017-01-18 三菱電機株式会社 Driving support device, driving support method and program
KR102347249B1 (en) 2014-10-21 2022-01-04 삼성전자주식회사 Method and device to display screen in response to event related to external obejct
US10410072B2 (en) 2015-11-20 2019-09-10 Mitsubishi Electric Corporation Driving support apparatus, driving support system, driving support method, and computer readable recording medium
US10003732B2 (en) 2016-02-25 2018-06-19 Foodim Ltd Depth of field processing
US10678256B2 (en) * 2017-09-28 2020-06-09 Nec Corporation Generating occlusion-aware bird eye view representations of complex road scenes
CN107719367A (en) * 2017-10-26 2018-02-23 西安正昌电子股份有限公司 360 ° of one kind is looked around and position-recognizing system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0626654A2 (en) * 1993-05-25 1994-11-30 Matsushita Electric Industrial Co., Ltd. Apparatus for measuring intervehicle distance by stereo vision
US5410346A (en) * 1992-03-23 1995-04-25 Fuji Jukogyo Kabushiki Kaisha System for monitoring condition outside vehicle using imaged picture by a plurality of television cameras
US5530420A (en) * 1993-12-27 1996-06-25 Fuji Jukogyo Kabushiki Kaisha Running guide apparatus for vehicle capable of keeping safety at passing through narrow path and the method thereof
US5729216A (en) * 1994-03-14 1998-03-17 Yazaki Corporation Apparatus for monitoring vehicle periphery
WO1998045816A1 (en) * 1997-04-07 1998-10-15 Synapix, Inc. Adaptive modeling and segmentation of visual image streams

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5809161A (en) 1992-03-20 1998-09-15 Commonwealth Scientific And Industrial Research Organisation Vehicle monitoring system
US5819016A (en) 1993-10-05 1998-10-06 Kabushiki Kaisha Toshiba Apparatus for modeling three dimensional information
US5793420A (en) 1994-10-28 1998-08-11 Schmidt; William P. Video recording system for vehicle
US5717456A (en) 1995-03-06 1998-02-10 Champion International Corporation System for monitoring a continuous manufacturing process
US5850352A (en) 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US5768443A (en) 1995-12-19 1998-06-16 Cognex Corporation Method for coordinating multiple fields of view in multi-camera
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method
US5680123A (en) 1996-08-06 1997-10-21 Lee; Gul Nam Vehicle monitoring system
US5982420A (en) * 1997-01-21 1999-11-09 The United States Of America As Represented By The Secretary Of The Navy Autotracking device designating a target

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5410346A (en) * 1992-03-23 1995-04-25 Fuji Jukogyo Kabushiki Kaisha System for monitoring condition outside vehicle using imaged picture by a plurality of television cameras
EP0626654A2 (en) * 1993-05-25 1994-11-30 Matsushita Electric Industrial Co., Ltd. Apparatus for measuring intervehicle distance by stereo vision
US5530420A (en) * 1993-12-27 1996-06-25 Fuji Jukogyo Kabushiki Kaisha Running guide apparatus for vehicle capable of keeping safety at passing through narrow path and the method thereof
US5729216A (en) * 1994-03-14 1998-03-17 Yazaki Corporation Apparatus for monitoring vehicle periphery
WO1998045816A1 (en) * 1997-04-07 1998-10-15 Synapix, Inc. Adaptive modeling and segmentation of visual image streams

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6999602B2 (en) 2000-06-30 2006-02-14 Matsushita Electric Industrial Co., Ltd. Image generation for assistance of drivers of vehicles
EP1168248A3 (en) * 2000-06-30 2003-12-10 Matsushita Electric Industrial Co., Ltd. Rendering device
US7257236B2 (en) 2002-05-22 2007-08-14 A4Vision Methods and systems for detecting and recognizing objects in a controlled wide area
US7174033B2 (en) 2002-05-22 2007-02-06 A4Vision Methods and systems for detecting and recognizing an object based on 3D image data
WO2003100710A1 (en) * 2002-05-22 2003-12-04 A4Vision Methods and systems for detecting and recognizing objects in a controlled wide area
US7660436B2 (en) 2003-06-13 2010-02-09 Sarnoff Corporation Stereo-vision based imminent collision detection
US7974442B2 (en) 2003-06-13 2011-07-05 Sri International Vehicular vision system
EP1639519A4 (en) * 2003-06-13 2009-07-08 Sarnoff Corp Vehicular vision system
US7957562B2 (en) 2003-06-13 2011-06-07 Sri International Method and apparatus for ground detection and removal in vision systems
EP1639519A1 (en) * 2003-06-13 2006-03-29 Sarnoff Corporation Vehicular vision system
EP1641653A2 (en) * 2003-07-02 2006-04-05 Sarnoff Corporation Method and apparatus for pedestrian detection
EP1639521A2 (en) * 2003-07-02 2006-03-29 Sarnoff Corporation Stereo-vision based imminent collision detection
EP1639516A2 (en) * 2003-07-02 2006-03-29 Sarnoff Corporation Method and apparatus for ground detection and removal in vision systems
EP1641653A4 (en) * 2003-07-02 2009-07-29 Sarnoff Corp Method and apparatus for pedestrian detection
EP1639521A4 (en) * 2003-07-02 2009-07-22 Sarnoff Corp Stereo-vision based imminent collision detection
EP1639516A4 (en) * 2003-07-02 2009-07-15 Sarnoff Corp Method and apparatus for ground detection and removal in vision systems
US8301344B2 (en) 2003-07-25 2012-10-30 Robert Bosch Gmbh Device for classifying at least one object in the surrounding field of a vehicle
WO2005013235A1 (en) * 2003-07-25 2005-02-10 Robert Bosch Gmbh Device for classifying at least one object in the vicinity of a vehicle
EP1530185A3 (en) * 2003-10-10 2005-05-18 Robert Bosch Gmbh Information apparatus
EP1530185A2 (en) * 2003-10-10 2005-05-11 Robert Bosch Gmbh Information apparatus
EP1709568A2 (en) * 2003-12-15 2006-10-11 Sarnoff Corporation Method and apparatus for object tracking prior to imminent collision detection
US7660438B2 (en) 2003-12-15 2010-02-09 Sarnoff Corporation Method and apparatus for object tracking prior to imminent collision detection
EP1709568A4 (en) * 2003-12-15 2009-07-29 Sarnoff Corp Method and apparatus for object tracking prior to imminent collision detection
EP1721287A4 (en) * 2004-03-02 2009-07-15 Sarnoff Corp Method and apparatus for detecting a presence
EP1721287A1 (en) * 2004-03-02 2006-11-15 Sarnoff Corporation Method and apparatus for detecting a presence
EP1679529A2 (en) * 2005-01-04 2006-07-12 Robert Bosch Gmbh Object detection method
EP1679529A3 (en) * 2005-01-04 2006-09-20 Robert Bosch Gmbh Object detection method
CN1924514B (en) * 2005-08-31 2012-01-25 歌乐牌株式会社 Obstacle detector for vehicle
US7557691B2 (en) 2005-08-31 2009-07-07 Clarion Co., Ltd. Obstacle detector for vehicle
EP1760489A1 (en) * 2005-08-31 2007-03-07 CLARION Co., Ltd. Obstacle detector for vehicle
EP2008223A2 (en) * 2006-03-29 2008-12-31 Mark Dronge Security alarm system
EP2008223A4 (en) * 2006-03-29 2010-11-17 Mark Dronge Security alarm system
US7864983B2 (en) 2006-03-29 2011-01-04 Mark Dronge Security alarm system
EP1964718A3 (en) * 2007-02-27 2012-07-04 Hitachi, Ltd. Image processing apparatus, image processing method and image processing system
WO2009044257A3 (en) * 2007-10-03 2009-06-25 Latecoere Method and system for aircraft taxiing assistance
WO2009044257A2 (en) * 2007-10-03 2009-04-09 Latecoere Method and system for aircraft taxiing assistance
FR2922072A1 (en) * 2007-10-03 2009-04-10 Latecoere Sa METHOD AND SYSTEM FOR AIDING AIRCRAFT
US8582818B2 (en) 2009-02-17 2013-11-12 Autoliv Development Ab Method and system of automatically detecting objects in front of a motor vehicle
WO2010094401A1 (en) * 2009-02-17 2010-08-26 Autoliv Development Ab A method and system of automatically detecting objects in front of a motor vehicle
WO2010115580A1 (en) * 2009-04-06 2010-10-14 Daimler Ag Method and apparatus for recognizing objects
DE102010013093A1 (en) * 2010-03-29 2011-09-29 Volkswagen Ag Method for creating model of surrounding area of motor vehicle i.e. car, involves determining whether card cells are loaded with object represented by three- dimensional structures
CN102348100A (en) * 2010-07-30 2012-02-08 江彦宏 Video radar display system
EP2420982A1 (en) * 2010-08-19 2012-02-22 Yan-Hong Chiang Video radar display system
EP2444947A1 (en) * 2010-10-20 2012-04-25 Yan-Hong Chiang Assistant driving system with video radar
TWI396642B (en) * 2010-10-20 2013-05-21
CN103247168A (en) * 2012-02-14 2013-08-14 江彦宏 Remote traffic management system using video radar
EP2629237A1 (en) * 2012-02-14 2013-08-21 Yan-Hong Chiang Remote vehicle management system by video radar
EP2843589A1 (en) * 2013-08-29 2015-03-04 Alcatel Lucent A method and platform for sending a message to a communication device associated with a moving object
WO2015028443A1 (en) * 2013-08-29 2015-03-05 Alcatel Lucent A method and platform for sending a message to a communication device associated with a moving object
US10447637B2 (en) 2013-08-29 2019-10-15 Alcatel Lucent Method and platform for sending a message to a communication device associated with a moving object
WO2015082105A1 (en) 2013-12-05 2015-06-11 Robert Bosch Gmbh Method and device for generating an alert by means of two images of a vehicle environment obtained via cameras
DE102013224954A1 (en) 2013-12-05 2015-06-11 Robert Bosch Gmbh Method and device for generating a warning by means of two images captured by cameras of a vehicle environment
US9846812B2 (en) 2014-10-10 2017-12-19 Application Solutions (Electronics and Vision) Ltd. Image recognition system for a vehicle and corresponding method

Also Published As

Publication number Publication date
EP1030188B1 (en) 2005-06-01
US6396535B1 (en) 2002-05-28
DE60020420D1 (en) 2005-07-07
JP2000244897A (en) 2000-09-08
ATE297022T1 (en) 2005-06-15
DE60020420T2 (en) 2006-05-04
JP3876288B2 (en) 2007-01-31

Similar Documents

Publication Publication Date Title
US6396535B1 (en) Situation awareness system
CA2747337C (en) Multiple object speed tracking system
US20020054210A1 (en) Method and apparatus for traffic light violation prediction and control
KR101999993B1 (en) Automatic traffic enforcement system using radar and camera
KR100862398B1 (en) Automatic police enforcement method of illegal-stopping and parking vehicle having cctv for preventing crime using multiple camera and system thereof
KR101967610B1 (en) Multi lane monitoring system that can recognize vehicle velocity and license plate number of multi lane
Cafiso et al. In-vehicle stereo vision system for identification of traffic conflicts between bus and pedestrian
US20130342700A1 (en) System and method for using pattern matching to determine the presence of designated objects in digital images
EP0878965A2 (en) Method for tracking entering object and apparatus for tracking and monitoring entering object
US11025865B1 (en) Contextual visual dataspaces
US20050278088A1 (en) Method and apparatus for collision avoidance and enhanced visibility in vehicles
KR101496390B1 (en) System for Vehicle Number Detection
CN102442311A (en) Method and device for determining processed image data about a sourround field of a vehicle
US20090315712A1 (en) Surveillance method and system using object based rule checking
KR102282800B1 (en) Method for trackig multi target employing ridar and camera
JPH11203589A (en) Traffic image pickup device and traffic monitoring device
KR102111363B1 (en) Accident monitoring system in tunnel using camera grouping of IoT based
KR101210615B1 (en) Regulation system of u-turn violation vehicle
RU120270U1 (en) PEDESTRIAN CROSSING CONTROL COMPLEX
US20210383688A1 (en) Traffic monitoring and evidence collection system
KR101719799B1 (en) CCTV monitoring system
KR101327348B1 (en) A system for ego-lane detection using defined mask in driving road image
CN114902309A (en) Driving support device, driving support method, and program
JP2012059139A (en) Traffic monitoring system
KR102030736B1 (en) Apparatus for analyzing Multi-Distributed Video Data

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

17P Request for examination filed

Effective date: 20000829

AKX Designation fees paid

Free format text: AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

17Q First examination report despatched

Effective date: 20020204

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT;WARNING: LAPSES OF ITALIAN PATENTS WITH EFFECTIVE DATE BEFORE 2007 MAY HAVE OCCURRED AT ANY TIME BEFORE 2007. THE CORRECT EFFECTIVE DATE MAY BE DIFFERENT FROM THE ONE RECORDED.

Effective date: 20050601

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050601

Ref country code: LI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050601

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050601

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050601

Ref country code: CH

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050601

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050601

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 60020420

Country of ref document: DE

Date of ref document: 20050707

Kind code of ref document: P

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050901

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050901

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050901

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050912

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20051103

NLV1 Nl: lapsed or annulled due to failure to fulfill the requirements of art. 29p and 29m of the patents act
REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20060214

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20060228

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20060228

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

RAP2 Party data changed (patent owner data changed or rights of a patent transferred)

Owner name: MITSUBISHI DENKI KABUSHIKI KAISHA

26N No opposition filed

Effective date: 20060302

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050601

REG Reference to a national code

Ref country code: GB

Ref legal event code: 746

Effective date: 20100615

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20140211

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20140212

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20140417

Year of fee payment: 15

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 60020420

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20150214

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20151030

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150214

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150901

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150302