US20060140447A1 - Vehicle-monitoring device and method using optical flow - Google Patents

Vehicle-monitoring device and method using optical flow

Info

Publication number
US20060140447A1
Authority
US
United States
Prior art keywords
vehicle
optical flow
target vehicle
area
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/236,236
Inventor
Sang-Cheol Park
Kwang-Soo Kim
Kyong-Ha Park
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KIM, KWANG-SOO, PARK, KYONG-HA, PARK, SANG-CHEOL
Publication of US20060140447A1 publication Critical patent/US20060140447A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00Prospecting or detecting by optical means
    • G01V8/10Detecting, e.g. by using light barriers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/16Special procedures for taking photographs; Apparatus therefor for photographing the track of moving objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/255Detecting or recognising potential candidate objects based on visual cues, e.g. shapes

Definitions

  • the camera 105 photographs a monitoring area and outputs video data that takes the form of an analog signal.
  • the camera 105 outputs video data to the video pre-processor 110 .
  • a video input unit 115 of the video pre-processor 110 digitizes the input video data to acquire information about a moving target vehicle from the entire video data.
  • a video corrector 120 filters the digital video data to remove noise.
  • the video corrector 120 includes a filter for removing noise from the input video data, such as a Gaussian filter or a median filter.
  • the video corrector 120 corrects the input video data through filtering.
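  • As a concrete illustration of such filtering (an assumed sketch, not the device's actual implementation), the median-filter option can be written directly in NumPy:

```python
import numpy as np

def median_filter(img, k=3):
    """Median filter over k x k windows: an assumed sketch of the noise
    removal performed by the video corrector 120 (a Gaussian filter is the
    other option the text mentions)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            # median of the k x k neighbourhood centred on (y, x)
            out[y, x] = np.median(padded[y:y + k, x:x + k])
    return out
```

A median filter suppresses impulse (salt-and-pepper) noise without blurring edges as much as averaging does, which is why it is a common alternative to a Gaussian filter in this pre-processing role.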
  • the optical flow detector 125 detects an optical flow to separate the area having motion, i.e., a moving target vehicle, from the stationary background in the video data. Specifically, an optical flow calculator 130 of the optical flow detector 125 calculates an optical flow over the whole video data to separate a moving target vehicle from a background that is stationary with respect to the fixed camera 105 . There are various methods for calculating an optical flow.
  • in Equation (1), the partial derivative I t of the pixel values with respect to time t is acquired.
  • the optical flow detector 125 detects an optical flow as shown in FIG. 2B through a process that uses Equation (1).
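  • Equation (1) itself is not reproduced on this page; assuming it is the standard gradient (brightness-constancy) constraint I x u + I y v + I t = 0, a minimal Lucas-Kanade-style least-squares solver over small windows can be sketched as follows (illustrative only, not the patented method):

```python
import numpy as np

def lucas_kanade_flow(prev, curr, win=5):
    """Per-pixel optical flow (u, v) from two grayscale frames, solving the
    gradient constraint I_x*u + I_y*v + I_t = 0 by least squares over a
    small window (Lucas-Kanade). Illustrative sketch only."""
    prev = prev.astype(np.float64)
    curr = curr.astype(np.float64)
    Iy, Ix = np.gradient(prev)   # spatial derivatives (axis 0 = y, axis 1 = x)
    It = curr - prev             # temporal derivative I_t between the frames
    half = win // 2
    h, w = prev.shape
    flow = np.zeros((h, w, 2))
    for y in range(half, h - half):
        for x in range(half, w - half):
            ix = Ix[y - half:y + half + 1, x - half:x + half + 1].ravel()
            iy = Iy[y - half:y + half + 1, x - half:x + half + 1].ravel()
            it = It[y - half:y + half + 1, x - half:x + half + 1].ravel()
            A = np.stack([ix, iy], axis=1)   # the window's constraint matrix
            ATA = A.T @ A
            if np.linalg.det(ATA) > 1e-6:    # skip textureless (ill-posed) windows
                flow[y, x] = np.linalg.solve(ATA, -A.T @ it)
    return flow
```

The least-squares step assumes the flow is constant within each window; windows with too little texture leave the system ill-conditioned and are skipped.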
  • a target vehicle 200 is moving.
  • Optical flow for the moving target vehicle 200 is detected in the form of vectors 210 as shown in FIG. 2B .
  • the optical flow is expressed as vectors.
  • compared with the smaller vectors of the background, the area of the target vehicle 200 is shown with larger vectors along the direction of motion of the target vehicle 200 .
  • the sizes of the vectors of the optical flow change according to the speed of the moving target vehicle 200 , but the pattern of the vectors 210 is constant.
  • a pattern of an optical flow for a moving target vehicle and a pattern of an optical flow for a stationary background are generated as shown in FIG. 2B .
  • since the pattern of the optical flow for the background remains constant, any area whose optical flow takes a pattern different from that of the background stands out from it.
  • An optical flow analyzer 135 stores information about the area for which the optical flow takes a different pattern, i.e., information about a vehicle candidate area, in units of a square area, i.e., in units of pixels. In this way, by separating the optical flow for a moving target vehicle from the optical flow for the background, every area other than the background optical flow is designated as a vehicle candidate area.
  • the optical flow analyzer 135 detects the area except for the background from the optical flow, which is calculated by the optical flow calculator 130 using Equation (1). To separate the optical flow for the background according to an embodiment of the present invention, the optical flow analyzer 135 refers to an optical flow lookup table.
  • the optical flow lookup table tabulates optical flows mapped to steering wheel angles and speeds of a source vehicle, stored in the form of a database.
  • FIG. 3 illustrates a change in an optical flow according to a change in a speed of a source vehicle according to an embodiment of the present invention
  • FIG. 4 illustrates a change in an optical flow according to a change in a steering wheel angle of the source vehicle according to the present invention.
  • Referring to FIG. 3 , in an optical flow directed outward from its center, the vectors inside the dotted circle in FIG. 3A are expressed larger than those in FIG. 3B because the speed of the source vehicle is higher. Each optical flow is mapped to a corresponding speed of the source vehicle.
  • an optical flow is also expressed differently according to a steering wheel angle of the source vehicle.
  • the optical flow lookup table includes not only an optical flow according to either the steering wheel angle of the source vehicle or the speed of the source vehicle, but also an optical flow according to both the steering wheel angle of the source vehicle and the speed of the source vehicle.
  • Such an optical flow according to the steering wheel angle and/or speed of the source vehicle will be referred to as a background optical flow.
  • the vehicle detector 140 extracts an area by removing the background optical flow to detect the target vehicle in the area. In other words, the vehicle detector 140 performs vehicle detection including determination of an actual type of the moving target vehicle once the vehicle candidate area is designated.
  • a candidate area detector 145 of the vehicle detector 140 separates the optical flow of the moving target vehicle and the background optical flow from the video data and performs an operation for removing the background optical flow.
  • the candidate area detector 145 searches in the optical flow lookup table for the background optical flow corresponding to current speed and steering wheel angle of the source vehicle and extracts the background optical flow by finding the matching optical flows in the lookup table.
  • the candidate area detector 145 compares an optical flow of the entire video data with the background optical flow corresponding to current speed and steering wheel angle of the source vehicle using the optical flow lookup table, hereinafter referred to as matching. If there is an area matched to the background optical flow as the result of the comparison, the optical flow for the target vehicle remains after removing the matched area from the optical flow of the entire video data.
  • FIG. 5 is a view for explaining a process of extracting an optical flow for a target vehicle according to an embodiment of the present invention. More specifically, FIG. 5A illustrates optical flow of the entire video data of FIG. 2B . The optical flow of FIG. 5A is divided into a background optical flow 500 and an optical flow 510 for the target vehicle.
  • the candidate area detector 145 searches the optical flow lookup table for an optical flow corresponding to the current speed and steering wheel angle of the source vehicle and detects an optical flow 520 as shown in FIG. 5B .
  • the candidate area detector 145 matches an optical flow of the entire video data to a background optical flow 520 of FIG. 5B to determine whether the entire video data includes an area that matches the background optical flow 520 .
  • the candidate area detector 145 removes the background optical flow 500 . In this way, the background optical flow is removed from the optical flow of the entire video data and only the optical flow 510 for the target vehicle remains, thereby detecting the vehicle candidate area.
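  • The lookup-and-removal steps above can be sketched as follows; the radial expansion pattern, its scaling with speed and steering, and the quantization of the table keys are illustrative assumptions, not values from the patent:

```python
import numpy as np

def radial_background_flow(h, w, speed, steer):
    """Hypothetical background optical flow for a given source-vehicle speed
    and steering angle: an expansion pattern radiating from the image center
    (cf. FIG. 3), shifted sideways with the steering angle (cf. FIG. 4).
    The scaling constants are illustrative."""
    y, x = np.mgrid[0:h, 0:w].astype(np.float64)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    flow = np.stack([x - cx, y - cy], axis=-1) * (speed / 600.0)
    flow[..., 0] += steer / 20.0
    return flow

def build_lookup_table(h, w, speeds=(0, 30, 60, 90), steers=(-20, 0, 20)):
    """Background-flow lookup table keyed by quantized (speed, steering)."""
    return {(s, a): radial_background_flow(h, w, s, a)
            for s in speeds for a in steers}

def vehicle_candidate_mask(flow, table, speed, steer, tol=0.5):
    """Find the table entry nearest the current speed/steering, remove the
    areas matching that background flow, and return a mask of what remains
    (the vehicle candidate area)."""
    key = min(table, key=lambda k: abs(k[0] - speed) + abs(k[1] - steer))
    diff = np.linalg.norm(flow - table[key], axis=-1)
    return diff > tol
```

Anywhere the observed flow matches the looked-up background flow within the tolerance is removed; only the deviating vectors, i.e., the candidate vehicle area, survive the mask.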
  • a template-matching unit 150 examines the correlation between the vehicle candidate area and previously created templates for various vehicle types for vehicle candidate detection.
  • vehicle detection using templates for various vehicle types involves previously preparing an image of each vehicle type, creating templates with an average image of each vehicle type, and detecting a vehicle using the created templates.
  • templates are shown, each of which is an average image of each vehicle type.
  • a van template 600 acquired by averaging images of vans is shown.
  • an SUV template 610 , an automobile template 620 , a bus template 630 , and a truck template 640 , each likewise acquired by averaging images of vehicles of the corresponding type, are also shown in FIG. 6 .
  • the template-matching unit 150 compares the detected vehicle candidate area with templates as shown in FIG. 6 to determine whether the detected vehicle candidate area corresponds to an actual image of the target vehicle.
  • template matching is used to detect the actual target vehicle from the vehicle candidate area.
  • for the template matching, techniques such as Principal Component Analysis (PCA) or a Support Vector Machine (SVM) may be used.
  • a result from the template-matching unit 150 indicates not only whether the vehicle candidate area corresponds to an actual image of the target vehicle but also the type of the target vehicle.
  • a vehicle candidate area is detected from the entire video data, the target vehicle is recognized through template matching in the detected vehicle candidate area, and the type of the target vehicle is determined.
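  • As a simplified stand-in for the template matching (the text names PCA and SVM as usable techniques), a normalized cross-correlation score against the average-image templates illustrates the idea; the threshold and type names are assumptions:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized grayscale areas."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def classify_candidate(candidate, templates, threshold=0.6):
    """Compare the vehicle candidate area with each average-image template;
    report the best-matching vehicle type, or None if no template clears
    the threshold (i.e., the candidate is not an actual vehicle image).
    The threshold value is illustrative."""
    best_type, best_score = None, threshold
    for vtype, tpl in templates.items():
        score = ncc(candidate, tpl)
        if score > best_score:
            best_type, best_score = vtype, score
    return best_type, best_score
```

Because the score is mean-subtracted and normalized, a match is found even if the candidate differs from the template in overall brightness or contrast.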
  • after completion of vehicle detection, vehicle tracking is available.
  • the vehicle-tracking unit 155 performs the vehicle tracking. To track a moving target vehicle, the vehicle-tracking unit 155 predicts a next position of the target vehicle using a prediction algorithm.
  • a vehicle information unit 160 of the vehicle tracking unit 155 predicts the position of the target vehicle in the next frame based on vehicle information output from the template matching unit 150 , i.e., information on a current image of the moving target vehicle and information on a previous image of the moving target vehicle.
  • the vehicle information unit 160 compares information on a vehicle area in the previous image and a vehicle prediction area in the current image through optical flow detection.
  • FIG. 7A is a view for calculating motion information according to a vector direction according to an embodiment of the present invention.
  • FIG. 7B is a view for calculating a relative speed according to a vector size according to an embodiment of the present invention.
  • the vehicle information unit 160 recognizes that a relative distance between the target vehicle and the source vehicle increases if the vector 700 is directed upward. On the other hand, if the vector 710 is directed downward, the vehicle information unit 160 recognizes that a relative distance between the target vehicle and the source vehicle decreases.
  • the vehicle information unit 160 recognizes that a relative speed between the target vehicle and the source vehicle increases if a vector size increases. On the other hand, if the vector size decreases, the vehicle information unit 160 recognizes that the relative speed decreases. Thus, the vehicle information unit 160 calculates the relative speed and motion information of the target vehicle to predict the target vehicle position.
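  • The interpretation of FIGS. 7A and 7B can be sketched as follows, assuming image coordinates in which y grows downward (so an upward vector has a negative y component):

```python
def interpret_vector(vec, prev_vec):
    """Interpret a target vehicle's optical-flow vector as in FIGS. 7A/7B:
    an upward vector means the relative distance to the source vehicle is
    increasing, a downward vector that it is decreasing; a growing vector
    size means the relative speed is increasing. Coordinate convention
    (y grows downward) is an assumption."""
    vx, vy = vec
    distance_trend = "increasing" if vy < 0 else "decreasing"
    size = (vx * vx + vy * vy) ** 0.5
    prev_size = (prev_vec[0] ** 2 + prev_vec[1] ** 2) ** 0.5
    speed_trend = "increasing" if size > prev_size else "decreasing"
    return distance_trend, speed_trend
```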
  • a tracking state determining unit 165 determines the accuracy of the tracking by comparing information on the actual position of the target vehicle in the next frame with the predicted information on the target vehicle. If there is an error within a predetermined range, vehicle tracking continues. Otherwise, information on the target vehicle is acquired again from the input video data and vehicle tracking is automatically performed again. When a new vehicle area is generated, the tracking state determining unit 165 also acquires information on the new target vehicle and tracks the new target vehicle through detection of the new vehicle area.
  • FIG. 8 is a control flowchart for vehicle detection according to an embodiment of the present invention.
  • the vehicle-monitoring device photographs a monitoring area through the camera 105 in step 800 .
  • the vehicle-monitoring device then pre-processes input video data acquired through the photographs.
  • the vehicle-monitoring device detects an optical flow from the pre-processed input video data in step 805 and proceeds to step 810 to compare the detected optical flow with a background optical flow in the optical flow lookup table.
  • the vehicle-monitoring device proceeds to step 815 to determine whether there is an area that matches the background optical flow in the optical flow of the entire video data. If there is a match, only the area matched to the background optical flow is removed from the entire image in step 820 .
  • a vehicle candidate area is detected in step 825 and is matched to previously stored vehicle templates in step 830 .
  • the vehicle-monitoring device proceeds to step 835 to detect information on whether the detected vehicle candidate area corresponds to an actual image of the target vehicle, or to detect vehicle information, such as the target vehicle type. In this way, vehicle detection is performed on the entire video data input through the camera 105 .
  • FIG. 9 is a control flowchart for vehicle tracking according to an embodiment of the present invention.
  • the video pre-processor 110 pre-processes input video data acquired from the photographs in step 910 .
  • the optical flow detector 125 detects an optical flow from the pre-processed input video data and the vehicle detector 140 detects an area having motion through optical flow detection. It is determined whether there is an area having motion in step 920 . If there is an area with motion detected, the process proceeds to step 925 .
  • the vehicle-tracking unit 155 predicts the target vehicle direction and the amount of target vehicle motion using information on the detected area in step 925 . In other words, the relative speed and motion information of the target vehicle are predicted based on the vehicle candidate area. Once the predicted information is acquired, it is compared with the actual information on the target vehicle to determine a tracking result.
  • the vehicle-tracking unit 155 determines whether any tracking error is within a predetermined range in step 930 to determine whether tracking is successful. If the error is within the predetermined range, the vehicle-tracking unit 155 recognizes that tracking is successful and continues tracking the moving target vehicle while transforming the direction and amount of motion of the target vehicle into its actual direction and distance. The actual direction and distance may be updated and displayed in real time on a screen of the vehicle detection/tracking device to allow a user to check the actual direction and distance.
  • a surrounding vehicle can be tracked while minimizing the influence of the surrounding environment, such as night, rain, or snow.
  • the present invention can be applied to products capable of preventing collision with front or rear vehicles during driving and determining whether surrounding vehicles are within a dangerous distance.

Abstract

Provided is a vehicle detection/tracking device that allows vehicle detection and then tracking using a camera that captures the area around the camera-mounted vehicle. To this end, an optical flow is acquired from video data input through a camera mounted in a vehicle during driving of the vehicle. A background optical flow, i.e., an optical flow generated by driving the vehicle, is acquired. The acquired optical flow and the acquired background optical flow are compared, and a vehicle candidate area is detected through an optical flow for an object around the vehicle. In particular, template matching is performed on the vehicle candidate area for vehicle detection, and a detected vehicle can be tracked by continuously comparing the detected optical flow and the background optical flow.

Description

    PRIORITY
  • This application claims priority under 35 U.S.C. § 119 to an application entitled “Vehicle-monitoring Device and Method Using Optical Flow” filed in the Korean Intellectual Property Office on Dec. 28, 2004 and assigned Ser. No. 2004-114083, the contents of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention generally relates to a vehicle detection/tracking method and device, and in particular, to a vehicle detection/tracking method and device using an optical flow, in which real-time vehicle detection is accomplished by calculating an optical flow using a vehicle-mounted camera.
  • 2. Description of the Related Art
  • Conventionally, a vehicle motion is detected using a shadow cast by the vehicle or by horizontal edges of the vehicle in a photographed image.
  • Shadow detection involves designating a vehicle candidate area using a shadow where the vehicle meets a road and detecting the vehicle by checking symmetry of the designated vehicle candidate area. Detection by photograph involves acquiring horizontal edges from a photographed image of the vehicle, designating an area where a predetermined portion of the horizontal edges appears as a vehicle candidate area, and detecting the vehicle using the designated vehicle candidate area and vehicle templates.
  • However, shadow detection is unreliable at night or in the rain, because the vehicle candidate area is reduced due to the dark and wet conditions. Moreover, since the shadow cast in the morning or evening is long because of the position of the sun, performance is degraded by time zones. In addition, the shadows of two parallel vehicles may overlap on the road surface. As a result, shadow detection may detect two vehicles as one vehicle. Thus, shadow detection has difficulty handling a new vehicle that is not subject to tracking.
  • The horizontal-edge-photograph detection uses information about the color or area of the road, and is not subject to the problems of shadow detection, but detection performance degrades on a curved or uphill/downhill road.
  • As described above, conventional vehicle detection performance is degraded by changes in a shadow, the parallel running of vehicles, and weather conditions.
  • SUMMARY OF THE INVENTION
  • It is, therefore, an object of the present invention to provide a vehicle detection/tracking method and device using an optical flow, in which a target vehicle is detected in real time by calculating an optical flow using a camera mounted in a source vehicle, and the detected target vehicle is then tracked.
  • To achieve the above and other objects, there is provided a vehicle-monitoring device using an optical flow. The vehicle-monitoring device includes a video pre-processor, an optical flow detector, and a vehicle detector. The video pre-processor processes video data input through a camera. The optical flow detector detects an optical flow from the input video data and separates a background optical flow area from the detected optical flow. The vehicle detector matches the remaining area of the detected optical flow except for the background optical flow area to templates to determine whether the remaining area is an area of a target vehicle.
  • To achieve the above and other objects, there is also provided a vehicle-monitoring method using an optical flow. The vehicle-monitoring method includes pre-processing video data input through a camera, detecting an optical flow from the input video data and separating a background optical flow from the detected optical flow, matching the remaining area of the detected optical flow except for the background optical flow to templates, and detecting information on a target vehicle if the remaining area is an area having motions of the target vehicle and detecting the target vehicle.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects, features and advantages of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings in which:
  • FIG. 1 is a block diagram of a vehicle-monitoring device having vehicle detection/tracking functions according to an embodiment of the present invention;
  • FIG. 2 illustrates an input image and an optical flow of the input image according to an embodiment of the present invention;
  • FIG. 3 illustrates a change in an optical flow according to a change in speed of a source vehicle according to an embodiment of the present invention;
  • FIG. 4 illustrates a change in an optical flow according to a change in a steering wheel angle of a source vehicle according to an embodiment of the present invention;
  • FIG. 5 is a view for explaining a process of extracting an optical flow for a target vehicle according to an embodiment of the present invention;
  • FIG. 6 illustrates templates for various vehicle types according to an embodiment of the present invention;
  • FIG. 7A is a view for calculating motion information according to a vector direction according to an embodiment of the present invention;
  • FIG. 7B is a view for calculating a relative speed according to a vector size according to an embodiment of the present invention;
  • FIG. 8 is a control flowchart for vehicle detection according to an embodiment of the present invention; and
  • FIG. 9 is a control flowchart for vehicle tracking according to an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • Preferred embodiments of the present invention will now be described in detail with reference to the annexed drawings. In the following description, a detailed description of known functions and configurations incorporated herein has been omitted for conciseness.
  • The present invention provides a vehicle detection/tracking device that allows vehicle detection and then tracking using a camera, mounted in a source vehicle, that captures the area around the source vehicle. To this end, in the present invention, an optical flow is acquired from video data that is input through the camera while driving the source vehicle. Then, a background optical flow generated from driving is acquired, the optical flow and the background optical flow are compared, and a vehicle candidate area is detected using the optical flow of motions of objects around the source vehicle that is obtained as the result of the comparison. In particular, in the present invention, a target vehicle is detected by matching a vehicle candidate area to vehicle templates and is then tracked by continuously comparing an optical flow and a background optical flow for the detected target vehicle.
  • The optical flow represents, as vectors, the apparent motions between two temporally different frames of video data photographed and input from a camera. A vehicle detection method using optical flow involves comparing pixels of a previously photographed frame with pixels of a currently photographed frame. Alternatively, the method may divide a previous image into a plurality of unit blocks, each having a predetermined number of pixels, divide a current image into unit blocks of the same size, and compare the unit blocks of the previous image with those of the current image while shifting over the current image pixel by pixel, calculating differences between the luminance or chrominance values of the previous-image and current-image pixels. The motion of the previous-image pixels is then expressed as vectors based on the calculated differences, and if a vector larger than a specific value is generated in a specific area, a target vehicle is detected using the vector.
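  • The block-comparison scheme described above can be sketched as follows. This is an illustrative exhaustive-search implementation in Python with NumPy; the function name, block size, and search range are assumptions for illustration, not part of the disclosure. Each block of the previous frame is matched against shifted windows of the current frame by the sum of absolute luminance differences.

```python
import numpy as np

def block_motion_vectors(prev, curr, block=8, search=4):
    """Estimate one motion vector per `block`-sized block of `prev` by
    exhaustively searching a +/- `search` pixel window in `curr`, using
    the sum of absolute luminance differences (SAD) as the match score.
    Real systems use faster search strategies; this is illustrative."""
    h, w = prev.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = prev[by:by + block, bx:bx + block].astype(int)
            best, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # window falls outside the frame
                    cand = curr[y:y + block, x:x + block].astype(int)
                    sad = np.abs(ref - cand).sum()
                    if best is None or sad < best:
                        best, best_v = sad, (dx, dy)
            vectors[(bx, by)] = best_v
    return vectors

# A bright patch shifted right by 2 pixels yields the vector (2, 0) for
# its block, while an empty block yields the zero vector.
prev = np.zeros((16, 16), dtype=np.uint8)
prev[4:8, 2:6] = 255
curr = np.zeros((16, 16), dtype=np.uint8)
curr[4:8, 4:8] = 255
vecs = block_motion_vectors(prev, curr)
```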
  • Hereinafter, functions of components of a vehicle detection/tracking device according to an embodiment of the present invention will be described, and a device having vehicle detection/tracking functions according to an embodiment of the present invention will be referred to as a vehicle-monitoring device. FIG. 1 is a block diagram of a vehicle-monitoring device according to an embodiment of the present invention.
  • Referring to FIG. 1, the vehicle-monitoring device 100 detects a moving target vehicle by analyzing an optical flow of video data input through a camera 105 mounted in a source vehicle and tracks the detected target vehicle.
  • The vehicle-monitoring device 100 includes a video pre-processor 110 that pre-processes an image input through the camera 105, an optical flow detector 125 that detects an optical flow for detection of a motion of a target vehicle from the input image, a vehicle detector 140 that detects a vehicle candidate area for vehicle detection using the detected optical flow, and a vehicle tracking unit 155 that tracks the detected target vehicle using information about the detected target vehicle.
  • The camera 105 photographs a monitoring area and outputs video data in the form of an analog signal to the video pre-processor 110. A video input unit 115 of the video pre-processor 110 digitizes the input video data so that information about a moving target vehicle can be acquired from the entire video data. A video corrector 120 filters the digital video data to remove noise. The video corrector 120 includes a noise-removing filter, such as a Gaussian filter or a median filter, and thus corrects the input video data through filtering.
  • The optical flow detector 125 detects an optical flow to separate an area having motion from the stationary background in the video data. In other words, the optical flow detector 125 detects an optical flow to separate a moving target vehicle and a stationary background from the entire video data. Specifically, an optical flow calculator 130 of the optical flow detector 125 calculates an optical flow of the whole video data to separate a moving target vehicle from the background, which is stationary with respect to the camera 105. There are various methods for calculating an optical flow. In the present invention, the Lucas & Kanade method, given by Equation (1) below, is used:

    [Ix Iy][u v]^T = -It    (1)

    where Ix represents the partial derivative of the image with respect to the x coordinate, Iy the partial derivative with respect to the y coordinate, It the partial derivative with respect to time t, and u and v the horizontal and vertical components of the optical flow in the input image coordinate system. Solving Equation (1) yields the flow components u and v from the measured derivatives Ix, Iy, and It.
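  • Equation (1) gives one linear constraint per pixel; the Lucas & Kanade method stacks these constraints over a small window and solves for (u, v) in the least-squares sense. A minimal sketch, assuming NumPy and simple finite-difference derivatives (the function name and derivative scheme are illustrative):

```python
import numpy as np

def lucas_kanade_window(prev, curr):
    """Solve the optical-flow constraint Ix*u + Iy*v = -It in the
    least-squares sense over all pixels of a window, stacking one
    instance of Equation (1) per pixel (Lucas & Kanade)."""
    Ix = np.gradient(prev.astype(float), axis=1)  # d/dx
    Iy = np.gradient(prev.astype(float), axis=0)  # d/dy
    It = curr.astype(float) - prev.astype(float)  # d/dt (frame diff)
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)  # N x 2 system
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# A horizontal intensity ramp translated one pixel to the right should
# give a flow of u = 1, v = 0.
x = np.arange(32, dtype=float)
prev = np.tile(x, (32, 1))        # intensity increases with x
curr = np.tile(x - 1.0, (32, 1))  # same pattern shifted right by 1
u, v = lucas_kanade_window(prev, curr)
```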
  • For example, upon input of an image as shown in FIG. 2A, the optical flow detector 125 detects an optical flow as shown in FIG. 2B through a process that uses Equation (1). In the image shown in FIG. 2A, a target vehicle 200 is moving. Optical flow for the moving target vehicle 200 is detected in the form of vectors 210 as shown in FIG. 2B.
  • As shown in FIG. 2B, the optical flow is expressed as vectors. The area of the target vehicle 200 is shown with larger vectors, directed along the direction of motion of the target vehicle 200, than the background. The sizes of the vectors of the optical flow change according to the speed of the moving target vehicle 200, but the pattern of the vectors 210 remains constant.
  • While the source vehicle is driven, a pattern of an optical flow for a moving target vehicle and a pattern of an optical flow for the stationary background are generated as shown in FIG. 2B. In particular, while the source vehicle is driven, the pattern of the optical flow for the background remains constant, so any area whose optical flow takes a different pattern stands out from the background. An optical flow analyzer 135 stores information about each area whose optical flow takes such a different pattern, i.e., information about a vehicle candidate area, in units of a square area, i.e., in units of pixels. In this way, by separating the optical flow for a moving target vehicle from the optical flow for the background, every area other than the background optical flow is designated as a vehicle candidate area.
  • The optical flow analyzer 135 detects the area except for the background from the optical flow, which is calculated by the optical flow calculator 130 using Equation (1). To separate the optical flow for the background according to an embodiment of the present invention, the optical flow analyzer 135 refers to an optical flow lookup table. The optical flow lookup table tabulates optical flows mapped to steering wheel angles and speeds of a source vehicle, stored in the form of a database.
  • For example, FIG. 3 illustrates a change in an optical flow according to a change in the speed of the source vehicle according to an embodiment of the present invention, and FIG. 4 illustrates a change in an optical flow according to a change in the steering wheel angle of the source vehicle according to the present invention. Referring to FIG. 3, the optical flow is directed outward from the center of the image; as the speed of the source vehicle increases, the vectors inside the dotted circle are expressed larger, as in FIG. 3A compared with FIG. 3B. Each optical flow is mapped to a corresponding speed of the source vehicle.
  • In addition, as shown in FIG. 4, an optical flow is also expressed differently according to a steering wheel angle of the source vehicle. The optical flow lookup table includes not only an optical flow according to either the steering wheel angle of the source vehicle or the speed of the source vehicle, but also an optical flow according to both the steering wheel angle of the source vehicle and the speed of the source vehicle. Such an optical flow according to the steering wheel angle and/or speed of the source vehicle will be referred to as a background optical flow.
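  • The optical flow lookup table described above can be sketched as a mapping from (speed, steering wheel angle) pairs to precomputed background flow fields. The radial toy flow, the key values, and the nearest-key lookup below are illustrative assumptions, since the patent does not specify the table's storage format:

```python
import numpy as np

def make_radial_flow(scale, rows=4, cols=4):
    """Toy background flow: vectors point outward from the image
    centre and grow with vehicle speed (cf. FIG. 3). Returns an array
    of shape (rows, cols, 2) holding one (u, v) vector per block."""
    ys, xs = np.mgrid[0:rows, 0:cols]
    cy, cx = (rows - 1) / 2.0, (cols - 1) / 2.0
    return np.stack([(xs - cx) * scale, (ys - cy) * scale], axis=-1)

# Hypothetical table: (speed km/h, steering angle deg) -> flow field.
lookup = {
    (30, 0):  make_radial_flow(0.5),
    (60, 0):  make_radial_flow(1.0),
    (60, 15): make_radial_flow(1.0) + np.array([0.3, 0.0]),  # turning
}

def background_flow(speed, angle):
    """Return the stored background flow for the nearest stored
    (speed, angle) key; a real table would be far denser."""
    key = min(lookup, key=lambda k: (k[0] - speed) ** 2 + (k[1] - angle) ** 2)
    return lookup[key]

flow = background_flow(58, 1)  # nearest stored key is (60, 0)
```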
  • Once the vehicle candidate area is designated, the vehicle detector 140 extracts an area by removing the background optical flow to detect the target vehicle in the area. In other words, the vehicle detector 140 performs vehicle detection including determination of an actual type of the moving target vehicle once the vehicle candidate area is designated.
  • More specifically, a candidate area detector 145 of the vehicle detector 140 separates the optical flow of the moving target vehicle from the background optical flow in the video data and performs an operation for removing the background optical flow. The candidate area detector 145 searches the optical flow lookup table for the background optical flow corresponding to the current speed and steering wheel angle of the source vehicle and extracts the background optical flow by finding the matching optical flows in the lookup table.
  • In other words, the candidate area detector 145 compares the optical flow of the entire video data with the background optical flow corresponding to the current speed and steering wheel angle of the source vehicle found in the optical flow lookup table; this comparison is hereinafter referred to as matching. If there is an area matched to the background optical flow as the result of the comparison, the optical flow for the target vehicle remains after removing the matched area from the optical flow of the entire video data.
  • FIG. 5 is a view for explaining a process of extracting an optical flow for a target vehicle according to an embodiment of the present invention. More specifically, FIG. 5A illustrates optical flow of the entire video data of FIG. 2B. The optical flow of FIG. 5A is divided into a background optical flow 500 and an optical flow 510 for the target vehicle.
  • The candidate area detector 145 searches the optical flow lookup table for an optical flow corresponding to the current speed and steering wheel angle of the source vehicle and detects an optical flow 520 as shown in FIG. 5B. The candidate area detector 145 matches an optical flow of the entire video data to a background optical flow 520 of FIG. 5B to determine whether the entire video data includes an area that matches the background optical flow 520. Thus, if the background optical flow 500 is similar to the background optical flow 520, the candidate area detector 145 removes the background optical flow 500. In this way, the background optical flow is removed from the optical flow of the entire video data and only the optical flow 510 for the target vehicle remains, thereby detecting the vehicle candidate area.
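  • This matching-and-removal step can be sketched as a per-position comparison between the measured flow and the expected background flow, keeping only the positions that disagree. The tolerance value and array layout below are illustrative assumptions:

```python
import numpy as np

def vehicle_candidate_mask(flow, background_flow, tol=0.5):
    """Mark as vehicle-candidate every position whose flow vector
    differs from the expected background flow by more than `tol`
    (Euclidean distance); positions matching the background are
    removed, leaving only the target-vehicle optical flow."""
    diff = np.linalg.norm(flow - background_flow, axis=-1)
    return diff > tol

# Background: uniform flow (1, 0). A moving vehicle occupies the 2x2
# block at the top-left with a different flow (0, -2).
background = np.tile(np.array([1.0, 0.0]), (6, 6, 1))
flow = background.copy()
flow[0:2, 0:2] = [0.0, -2.0]
mask = vehicle_candidate_mask(flow, background)
```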
  • Once only the vehicle candidate area remains, a template-matching unit 150 examines the correlation between the vehicle candidate area and previously created templates for various vehicle types to detect the target vehicle.
  • Here, vehicle detection using templates for various vehicle types involves previously preparing an image of each vehicle type, creating templates with an average image of each vehicle type, and detecting a vehicle using the created templates.
  • For example, referring to FIG. 6, templates are shown, each of which is an average image of each vehicle type. In FIG. 6, a van template 600 acquired by averaging images of vans is shown. Similarly, an SUV template 610, an automobile template 620, a bus template 630, and a truck template 640 are shown in FIG. 6. In other words, since there are various kinds of buses, the bus template 630 is acquired by averaging images of the buses.
  • The template-matching unit 150 compares the detected vehicle candidate area with templates as shown in FIG. 6 to determine whether the detected vehicle candidate area corresponds to an actual image of the target vehicle. In the foregoing description, template matching is used to detect the actual target vehicle from the vehicle candidate area. However, Principal Component Analysis (PCA) transformation or a Support Vector Machine (SVM) may be used instead. As such, a result from the template-matching unit 150 indicates not only whether the vehicle candidate area corresponds to an actual image of the target vehicle but also the type of the target vehicle.
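  • A correlation test between a candidate area and the average-image templates might look like the following sketch, using normalized cross-correlation as the correlation measure. The threshold and the toy 8x8 random templates are illustrative assumptions (the patent also allows PCA or SVM classifiers in place of template matching):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size images:
    1.0 for a perfect (linear) match, near 0 for unrelated images."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def classify(candidate, templates, threshold=0.6):
    """Return the best-matching vehicle type, or None when no template
    correlates above `threshold` (candidate is not a vehicle)."""
    scores = {name: ncc(candidate, tpl) for name, tpl in templates.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Toy 8x8 "average images" standing in for the templates of FIG. 6.
rng = np.random.default_rng(0)
templates = {"van": rng.random((8, 8)), "bus": rng.random((8, 8))}
candidate = templates["bus"] + rng.normal(0, 0.05, (8, 8))  # noisy bus
result = classify(candidate, templates)
```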
  • Through the foregoing process, a vehicle candidate area is detected from the entire video data, the target vehicle is recognized through template matching in the detected vehicle candidate area, and the type of the target vehicle is determined.
  • After completion of vehicle detection, vehicle tracking is available. The vehicle-tracking unit 155 performs the vehicle tracking. To track a moving target vehicle, the vehicle-tracking unit 155 predicts a next position of the target vehicle using a prediction algorithm.
  • More specifically, a vehicle information unit 160 of the vehicle tracking unit 155 predicts the position of the target vehicle in the next frame based on vehicle information output from the template matching unit 150, i.e., information on a current image of the moving target vehicle and information on a previous image of the moving target vehicle. In other words, the vehicle information unit 160 compares information on a vehicle area in the previous image and a vehicle prediction area in the current image through optical flow detection.
  • At this time, the vehicle information unit 160 predicts the target vehicle position by calculating relative speed and motion information of the target vehicle. FIG. 7A is a view for calculating motion information according to a vector direction according to an embodiment of the present invention. FIG. 7B is a view for calculating a relative speed according to a vector size according to an embodiment of the present invention.
  • Referring to FIG. 7A, the vehicle information unit 160 recognizes that a relative distance between the target vehicle and the source vehicle increases if the vector 700 is directed upward. On the other hand, if the vector 710 is directed downward, the vehicle information unit 160 recognizes that a relative distance between the target vehicle and the source vehicle decreases.
  • Referring to FIG. 7B, the vehicle information unit 160 recognizes that a relative speed between the target vehicle and the source vehicle increases if a vector size increases. On the other hand, if the vector size decreases, the vehicle information unit 160 recognizes that the relative speed decreases. Thus, the vehicle information unit 160 calculates the relative speed and motion information of the target vehicle to predict the target vehicle position.
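  • The rules of FIGS. 7A and 7B can be sketched as a small decision function over a target-vehicle flow vector in image coordinates (y grows downward). The function name and the string labels are illustrative:

```python
import math

def interpret_motion(prev_vec, curr_vec):
    """Classify relative motion from target-vehicle flow vectors in
    image coordinates (y grows downward). An upward vector (vy < 0)
    means the relative distance to the source vehicle is increasing,
    a downward vector that it is decreasing (FIG. 7A); a growing
    vector magnitude means the relative speed is increasing, a
    shrinking one that it is decreasing (FIG. 7B)."""
    vy = curr_vec[1]
    distance = ("increasing" if vy < 0
                else "decreasing" if vy > 0 else "constant")
    prev_mag, curr_mag = math.hypot(*prev_vec), math.hypot(*curr_vec)
    speed = ("increasing" if curr_mag > prev_mag
             else "decreasing" if curr_mag < prev_mag else "constant")
    return distance, speed

# Vector pointing up and growing between frames: the target pulls away
# while the relative speed rises.
status = interpret_motion((0.0, -1.0), (0.0, -2.0))
```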
  • A tracking state determining unit 165 determines the accuracy of the tracking by comparing information on the actual position of the target vehicle in the next frame with the predicted information on the target vehicle. If there is an error within a predetermined range, vehicle tracking continues. Otherwise, information on the target vehicle is acquired again from the input video data and vehicle tracking is automatically performed again. When a new vehicle area is generated, the tracking state determining unit 165 also acquires information on the new target vehicle and tracks the new target vehicle through detection of the new vehicle area.
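  • The predict-compare-decide loop of the tracking state determining unit can be sketched as follows, assuming a simple constant-velocity predictor and a Euclidean error threshold (both are illustrative assumptions; the patent does not fix the prediction algorithm or the error metric):

```python
import math

def predict_next(positions):
    """Predict the next position by extrapolating the last observed
    motion (a constant-velocity predictor; the exact prediction
    algorithm is left open by the description)."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

def tracking_ok(predicted, actual, max_error=5.0):
    """Continue tracking when the prediction error is within the
    predetermined range; otherwise the device re-detects the target
    vehicle from the input video data."""
    return math.dist(predicted, actual) <= max_error

history = [(10.0, 40.0), (14.0, 38.0)]  # target moving up and right
predicted = predict_next(history)        # -> (18.0, 36.0)
```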
  • FIG. 8 is a control flowchart for vehicle detection according to an embodiment of the present invention.
  • First, the vehicle-monitoring device photographs a monitoring area through the camera 105 in step 800. The vehicle-monitoring device then pre-processes input video data acquired through the photographs. The vehicle-monitoring device detects an optical flow from the pre-processed input video data in step 805 and proceeds to step 810 to compare the detected optical flow with a background optical flow in the optical flow lookup table. The vehicle-monitoring device proceeds to step 815 to determine whether there is an area that matches the background optical flow in the optical flow of the entire video data. If there is a match, only the area matched to the background optical flow is removed from the entire image in step 820. Thus, a vehicle candidate area is detected in step 825 and is matched to previously stored vehicle templates in step 830.
  • After template matching, the vehicle-monitoring device proceeds to step 835 to detect information on whether the detected vehicle candidate area corresponds to an actual image of the target vehicle, or to detect vehicle information, such as the target vehicle type. In this way, vehicle detection is performed on the entire video data input through the camera 105.
  • FIG. 9 is a control flowchart for vehicle tracking according to an embodiment of the present invention.
  • First, once the camera 105 photographs a monitoring area in step 905, the video pre-processor 110 pre-processes input video data acquired from the photographs in step 910. The optical flow detector 125 detects an optical flow from the pre-processed input video data and the vehicle detector 140 detects an area having motion through optical flow detection. It is determined whether there is an area having motion in step 920. If there is an area with motion detected, the process proceeds to step 925. The vehicle-tracking unit 155 predicts the target vehicle direction and the amount of target vehicle motion using information on the detected area in step 925. In other words, the relative speed and motion information of the target vehicle are predicted based on the vehicle candidate area. Once the predicted information is acquired, it is compared with the actual information on the target vehicle to determine a tracking result.
  • The vehicle-tracking unit 155 determines whether any tracking error is within a predetermined range in step 930 to determine whether tracking is successful. If the error is within the predetermined range, the vehicle-tracking unit 155 recognizes that tracking is successful and continues tracking the moving target vehicle while transforming the direction and amount of motion of the target vehicle into its actual direction and distance. The actual direction and distance may be updated and displayed in real time on a screen of the vehicle detection/tracking device to allow a user to check the actual direction and distance.
  • As described above, according to the present invention, a surrounding vehicle can be tracked by minimizing an influence from a surrounding environment such as a night, rainy, or snowy environment. In addition, the present invention can be applied to products capable of preventing collision with front or rear vehicles during driving and determining whether surrounding vehicles are within a dangerous distance.
  • While the invention has been shown and described with reference to a certain preferred embodiment thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (18)

1. A vehicle-monitoring device using an optical flow, the vehicle-monitoring device comprising:
a video pre-processor for pre-processing video data input through a camera;
an optical flow detector for detecting an optical flow from the video data and separating a background optical flow area from the detected optical flow; and
a vehicle detector for matching a remaining optical flow area, except for the background optical flow area, to templates to determine whether the remaining optical flow area is an area of a target vehicle.
2. The vehicle-monitoring device of claim 1, wherein the vehicle detector performs vehicle detection by determining a type of the target vehicle when the remaining optical flow area is the area of the target vehicle.
3. The vehicle-monitoring device of claim 1, wherein the video pre-processor comprises:
a video input unit for digitizing the video data; and
a video corrector for performing filtering to remove noise from the video data.
4. The vehicle-monitoring device of claim 1, wherein the optical flow detector comprises:
an optical flow calculator for calculating an optical flow of the video data; and
an optical flow analyzer for storing information about an area that shows a pattern different from a stationary area optical flow pattern.
5. The vehicle-monitoring device of claim 4, wherein the optical flow analyzer extracts a background optical flow from the video data by referring to an optical flow lookup table that stores background optical flows mapped to at least one of a vehicle steering wheel angle and speed information.
6. The vehicle-monitoring device of claim 1, wherein the vehicle detector comprises:
a candidate area detector for searching for a background optical flow corresponding to speed and steering wheel angle information of a source vehicle stored in an optical flow lookup table and extracting as a vehicle candidate area the area remaining after removing a background from the video data through matching; and
a template matching unit for examining correlation between the vehicle candidate area and previously stored templates for various vehicle types to determine whether the detected vehicle candidate area corresponds to an actual image of the target vehicle or a type of the target vehicle.
7. The vehicle-monitoring device of claim 1, further comprising a vehicle-tracking unit for tracking the target vehicle using information on the detected target vehicle.
8. The vehicle-monitoring device of claim 7, wherein the vehicle-tracking unit further comprises:
a vehicle storing unit for predicting a position in a next frame to which the target vehicle detected by the vehicle detector is to be moved based on information on a current image of the target vehicle and information on a previous image of the target vehicle; and
a tracking state determining unit for determining a result of tracking by comparing information on the target vehicle actually acquired from the next frame with predicted information on the target vehicle.
9. The vehicle-monitoring device of claim 8, wherein the vehicle detector continues tracking the target vehicle if an error of tracking is within a predetermined range or repeats automatic tracking if the error of tracking is outside the predetermined range.
10. A vehicle-monitoring method using an optical flow, the vehicle-monitoring method comprising the steps of:
pre-processing video data;
detecting an optical flow from the video data and separating a background optical flow from the detected optical flow;
matching a remaining optical flow area, except for the background optical flow area, to templates;
detecting information on a target vehicle if the remaining optical flow area is an area having motion of the target vehicle; and
detecting the target vehicle.
11. The vehicle-monitoring method of claim 10, wherein the step of separating the background optical flow comprises the steps of:
calculating an optical flow of the video data; and
comparing the calculated optical flow with a corresponding background optical flow previously stored in an optical flow lookup table.
12. The vehicle-monitoring method of claim 11, wherein the optical flow lookup table stores background optical flows that vary with speed and steering wheel angle information of a source vehicle having a camera mounted therein.
13. The vehicle-monitoring method of claim 11, wherein the step of comparing the calculated optical flow with the background optical flow comprises:
searching for a corresponding background optical flow among previously stored background optical flows according to speed and steering wheel angle information of a source vehicle; and
extracting the corresponding background optical flow from the input image if a corresponding background optical flow is found.
14. The vehicle-monitoring method of claim 10, wherein the step of matching the remaining optical flow area, except for the background optical flow area, to templates comprises:
extracting the area remaining after removal of a background as a vehicle candidate area;
examining correlation between the vehicle candidate area and previously stored templates of various types of vehicles; and
determining whether the detected vehicle candidate area is an actual image of the target vehicle or a type of the target vehicle.
15. The vehicle-monitoring method of claim 10, further comprising the step of tracking the target vehicle using information on the target vehicle.
16. The vehicle-monitoring method of claim 15, wherein the step of tracking the target vehicle comprises:
predicting a position in a next frame to which the target vehicle is to be moved based on a current image of the target vehicle and information on a previous image of the target vehicle; and
determining a result of tracking by comparing information on the target vehicle in the next frame with predicted information on the target vehicle.
17. The vehicle-monitoring method of claim 16, wherein the step of predicting the position in the next frame comprises the step of calculating relative speed and motion information of the target vehicle.
18. The vehicle-monitoring method of claim 10, wherein the video data is input by a camera.
US11/236,236 2004-12-28 2005-09-27 Vehicle-monitoring device and method using optical flow Abandoned US20060140447A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020040114083A KR100630088B1 (en) 2004-12-28 2004-12-28 Apparatus and method for supervising vehicle using optical flow
KR114083/2004 2004-12-28

Publications (1)

Publication Number Publication Date
US20060140447A1 true US20060140447A1 (en) 2006-06-29

Family

ID=36611565

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/236,236 Abandoned US20060140447A1 (en) 2004-12-28 2005-09-27 Vehicle-monitoring device and method using optical flow

Country Status (2)

Country Link
US (1) US20060140447A1 (en)
KR (1) KR100630088B1 (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070070201A1 (en) * 2005-09-29 2007-03-29 Matsushita Electric Industrial Co., Ltd. Object tracking method and object tracking apparatus
US20090125223A1 (en) * 2006-03-31 2009-05-14 Higgins Robert P Video navigation
US20100080419A1 (en) * 2008-09-30 2010-04-01 Mazda Motor Corporation Image processing device for vehicle
US20100177936A1 (en) * 2007-07-05 2010-07-15 Julia Ebling Device for identifying and/or classifying movement patterns in an image sequence of a surveillance scene, method and computer program
EP2549457A1 (en) * 2010-03-17 2013-01-23 Hitachi Automotive Systems, Ltd. Vehicle-mounting vehicle-surroundings recognition apparatus and vehicle-mounting vehicle-surroundings recognition system
US20130215270A1 (en) * 2012-02-16 2013-08-22 Fujitsu Ten Limited Object detection apparatus
US20140071286A1 (en) * 2012-09-13 2014-03-13 Xerox Corporation Method for stop sign law enforcement using motion vectors in video streams
US20140376769A1 (en) * 2013-06-20 2014-12-25 Xerox Corporation Method for detecting large size and passenger vehicles from fixed cameras
US20150063647A1 (en) * 2013-09-05 2015-03-05 Hyundai Motor Company Apparatus and method for detecting obstacle
US20150063630A1 (en) * 2013-08-27 2015-03-05 Hyundai Motor Company Apparatus and method for detecting obstacle
US9104920B2 (en) 2012-10-30 2015-08-11 Hyundai Motor Company Apparatus and method for detecting obstacle for around view monitoring system
US20160093052A1 (en) * 2014-09-26 2016-03-31 Neusoft Corporation Method and apparatus for detecting obstacle based on monocular camera
EP2121405B1 (en) * 2007-01-18 2016-11-30 Siemens Corporation System and method for vehicle detection and tracking
JP2017041132A (en) * 2015-08-20 2017-02-23 株式会社Jvcケンウッド Vehicle detection device, vehicle detection system, vehicle detection method, and vehicle detection program
US20170124401A1 (en) * 2015-10-29 2017-05-04 Samsung Sds Co., Ltd. System and method for searching location of object
CN106909936A (en) * 2017-01-21 2017-06-30 江苏大学 A kind of vehicle checking method based on double vehicle deformable part models
US20170345164A1 (en) * 2015-02-16 2017-11-30 Applications Solutions (Electronic and Vision) Ltd Method and device for the estimation of car ego-motion from surround view images
CN109983469A (en) * 2016-11-23 2019-07-05 Lg伊诺特有限公司 Use the image analysis method of vehicle drive information, device, the system and program and storage medium
JP2019161486A (en) * 2018-03-14 2019-09-19 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Dynamic body detection device, control device, moving body, dynamic body detection method, and program
US20190294892A1 (en) * 2017-04-22 2019-09-26 Honda Motor Co., Ltd. System and method for remapping surface areas of a vehicle environment
CN111351474A (en) * 2018-12-24 2020-06-30 上海欧菲智能车联科技有限公司 Vehicle moving target detection method, device and system
CN112037250A (en) * 2020-07-27 2020-12-04 国网四川省电力公司 Target vehicle vector trajectory tracking and engineering view modeling method and device
US10878259B2 (en) 2018-10-17 2020-12-29 Automotive Research & Testing Center Vehicle detecting method, nighttime vehicle detecting method based on dynamic light intensity and system thereof
US20210279890A1 (en) * 2018-11-27 2021-09-09 Omnivision Sensor Solution (Shanghai) Co., Ltd Target tracking method and computing device

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102359083B1 (en) * 2015-06-02 2022-02-08 에스케이하이닉스 주식회사 Device for detecting moving object and method thereof
DE102016213495A1 (en) * 2016-07-22 2018-01-25 Robert Bosch Gmbh Driver assistance system, driver assistance system and vehicle
KR101976952B1 (en) 2017-11-01 2019-05-09 재단법인 다차원 스마트 아이티 융합시스템 연구단 System and method for detecting object using motion vector
KR102219906B1 (en) * 2018-12-20 2021-02-24 한동대학교 산학협력단 Method and apparatus for automatically generating learning data for machine learning

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5530420A (en) * 1993-12-27 1996-06-25 Fuji Jukogyo Kabushiki Kaisha Running guide apparatus for vehicle capable of keeping safety at passing through narrow path and the method thereof
US6466684B1 (en) * 1998-09-14 2002-10-15 Yazaki Corporation Environment monitoring system
US6535114B1 (en) * 2000-03-22 2003-03-18 Toyota Jidosha Kabushiki Kaisha Method and apparatus for environment recognition
US7190282B2 (en) * 2004-03-26 2007-03-13 Mitsubishi Jidosha Kogyo Kabushiki Kaisha Nose-view monitoring apparatus
US7248718B2 (en) * 2004-02-19 2007-07-24 Siemens Corporate Research, Inc. System and method for detecting a passing vehicle from dynamic background using robust information fusion
US7266220B2 (en) * 2002-05-09 2007-09-04 Matsushita Electric Industrial Co., Ltd. Monitoring device, monitoring method and program for monitoring

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3395634B2 (en) 1998-03-12 2003-04-14 三菱自動車工業株式会社 Optical flow type backward information detection device
KR100364582B1 (en) 2000-04-28 2002-12-16 주식회사 네트웍코리아 System tracking and watching multi moving object
JP3888055B2 (en) 2000-12-26 2007-02-28 財団法人鉄道総合技術研究所 Train forward anomaly detection device using optical flow


Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009510541A (en) * 2005-09-29 2009-03-12 パナソニック株式会社 Object tracking method and object tracking apparatus
US7860162B2 (en) * 2005-09-29 2010-12-28 Panasonic Corporation Object tracking method and object tracking apparatus
US20070070201A1 (en) * 2005-09-29 2007-03-29 Matsushita Electric Industrial Co., Ltd. Object tracking method and object tracking apparatus
US8666661B2 (en) * 2006-03-31 2014-03-04 The Boeing Company Video navigation
US20090125223A1 (en) * 2006-03-31 2009-05-14 Higgins Robert P Video navigation
EP2121405B1 (en) * 2007-01-18 2016-11-30 Siemens Corporation System and method for vehicle detection and tracking
US20100177936A1 (en) * 2007-07-05 2010-07-15 Julia Ebling Device for identifying and/or classifying movement patterns in an image sequence of a surveillance scene, method and computer program
US8761436B2 (en) * 2007-07-05 2014-06-24 Robert Bosch Gmbh Device for identifying and/or classifying movement patterns in an image sequence of a surveillance scene, method and computer program
US20100080419A1 (en) * 2008-09-30 2010-04-01 Mazda Motor Corporation Image processing device for vehicle
US8259998B2 (en) * 2008-09-30 2012-09-04 Mazda Motor Corporation Image processing device for vehicle
EP2549457A4 (en) * 2010-03-17 2014-06-18 Hitachi Automotive Systems Ltd Vehicle-mounting vehicle-surroundings recognition apparatus and vehicle-mounting vehicle-surroundings recognition system
EP2549457A1 (en) * 2010-03-17 2013-01-23 Hitachi Automotive Systems, Ltd. Vehicle-mounting vehicle-surroundings recognition apparatus and vehicle-mounting vehicle-surroundings recognition system
JP2013168062A (en) * 2012-02-16 2013-08-29 Fujitsu Ten Ltd Object detection device and object detection method
US20130215270A1 (en) * 2012-02-16 2013-08-22 Fujitsu Ten Limited Object detection apparatus
US10018703B2 (en) * 2012-09-13 2018-07-10 Conduent Business Services, Llc Method for stop sign law enforcement using motion vectors in video streams
US20140071286A1 (en) * 2012-09-13 2014-03-13 Xerox Corporation Method for stop sign law enforcement using motion vectors in video streams
US9104920B2 (en) 2012-10-30 2015-08-11 Hyundai Motor Company Apparatus and method for detecting obstacle for around view monitoring system
US20140376769A1 (en) * 2013-06-20 2014-12-25 Xerox Corporation Method for detecting large size and passenger vehicles from fixed cameras
US9317752B2 (en) * 2013-06-20 2016-04-19 Xerox Corporation Method for detecting large size and passenger vehicles from fixed cameras
CN104417454A (en) * 2013-08-27 2015-03-18 现代自动车株式会社 Device and method for detecting obstacles
US20150063630A1 (en) * 2013-08-27 2015-03-05 Hyundai Motor Company Apparatus and method for detecting obstacle
US9418443B2 (en) * 2013-08-27 2016-08-16 Hyundai Motor Company Apparatus and method for detecting obstacle
US9183449B2 (en) * 2013-09-05 2015-11-10 Hyundai Motor Company Apparatus and method for detecting obstacle
US20150063647A1 (en) * 2013-09-05 2015-03-05 Hyundai Motor Company Apparatus and method for detecting obstacle
US20160093052A1 (en) * 2014-09-26 2016-03-31 Neusoft Corporation Method and apparatus for detecting obstacle based on monocular camera
US9521317B2 (en) * 2014-09-26 2016-12-13 Neusoft Corporation Method and apparatus for detecting obstacle based on monocular camera
US10867401B2 (en) * 2015-02-16 2020-12-15 Application Solutions (Electronics and Vision) Ltd. Method and device for the estimation of car ego-motion from surround view images
US20170345164A1 (en) * 2015-02-16 2017-11-30 Application Solutions (Electronics and Vision) Ltd. Method and device for the estimation of car ego-motion from surround view images
JP2017041132A (en) * 2015-08-20 2017-02-23 JVC Kenwood Corporation Vehicle detection device, vehicle detection system, vehicle detection method, and vehicle detection program
US20170124401A1 (en) * 2015-10-29 2017-05-04 Samsung Sds Co., Ltd. System and method for searching location of object
CN109983469A (en) * 2016-11-23 2019-07-05 LG Innotek Co., Ltd. Image analysis method, device, system, and program using vehicle driving information, and storage medium
US20190279374A1 (en) * 2016-11-23 2019-09-12 Lg Innotek Co., Ltd. Image analysis method, device, system, and program, which use vehicle driving information, and storage medium
US10909693B2 (en) * 2016-11-23 2021-02-02 Lg Innotek Co., Ltd. Image analysis method, device, system, and program, which use vehicle driving information, and storage medium
CN106909936A (en) * 2017-01-21 2017-06-30 Jiangsu University Vehicle detection method based on dual vehicle deformable part models
US20190294892A1 (en) * 2017-04-22 2019-09-26 Honda Motor Co., Ltd. System and method for remapping surface areas of a vehicle environment
US10970563B2 (en) * 2017-04-22 2021-04-06 Honda Motor Co., Ltd. System and method for remapping surface areas of a vehicle environment
JP2019161486A (en) * 2018-03-14 2019-09-19 SZ DJI Technology Co., Ltd. Moving body detection device, control device, moving body, moving body detection method, and program
CN110392891A (en) * 2018-03-14 2019-10-29 SZ DJI Technology Co., Ltd. Moving body detection device, control device, moving body, moving body detection method, and program
US10878259B2 (en) 2018-10-17 2020-12-29 Automotive Research & Testing Center Vehicle detecting method, nighttime vehicle detecting method based on dynamic light intensity and system thereof
US20210279890A1 (en) * 2018-11-27 2021-09-09 Omnivision Sensor Solution (Shanghai) Co., Ltd Target tracking method and computing device
US11657516B2 (en) * 2018-11-27 2023-05-23 Omnivision Sensor Solution (Shanghai) Co., Ltd Target tracking method and computing device
CN111351474A (en) * 2018-12-24 2020-06-30 Shanghai OFILM Smart Car Technology Co., Ltd. Vehicle moving target detection method, device and system
CN112037250A (en) * 2020-07-27 2020-12-04 State Grid Sichuan Electric Power Company Target vehicle vector trajectory tracking and engineering view modeling method and device

Also Published As

Publication number Publication date
KR20060075311A (en) 2006-07-04
KR100630088B1 (en) 2006-09-27

Similar Documents

Publication number Publication date Title
US20060140447A1 (en) Vehicle-monitoring device and method using optical flow
US7366325B2 (en) Moving object detection using low illumination depth capable computer vision
US7936903B2 (en) Method and a system for detecting a road at night
US6690011B2 (en) Infrared image-processing apparatus
US20080166018A1 (en) Method and apparatus for performing object recognition on a target detected using motion information
EP1837803A2 (en) Headlight, taillight and streetlight detection
JP2003346278A (en) Apparatus and method for measuring queue length of vehicles
US10878259B2 (en) Vehicle detecting method, nighttime vehicle detecting method based on dynamic light intensity and system thereof
JP2000030197A (en) Device for detecting warm body
EP2741234B1 (en) Object localization using vertical symmetry
US8559727B1 (en) Temporal coherence in clear path detection
JPH08329393A (en) Preceding vehicle detector
KR20150022076A (en) Image processing method for vehicle camera and image processing apparatus usnig the same
JPH11195127A (en) Method for recognizing white line and device therefor
JP5353531B2 (en) Vehicle light recognition device and program
JP2007172504A (en) Adhering matter detection device and adhering matter detection method
JPH0973529A (en) Two-wheeled vehicle monitor device
JPH0757200A (en) Method and device for recognizing travel course
CN115088248A (en) Image pickup apparatus, image pickup system, and image pickup method
JP3196559B2 (en) Line recognition method
JP3081660B2 (en) Distance detection method
JP2006107000A (en) Method and device for deciding image abnormality
KR102203277B1 (en) System and operating method of image processing system for vehicle
JPH10188198A (en) White line detector for vehicle
Ahmed et al. Design and development of image based lane warning and anti-collision detection system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARK, SANG-CHEOL;KIM, KWANG-SOO;PARK, KYONG-HA;REEL/FRAME:017037/0780

Effective date: 20050920

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION