WO2011056730A2 - Standoff range sense through obstruction radar system - Google Patents

Standoff range sense through obstruction radar system

Info

Publication number
WO2011056730A2
Authority
WO
WIPO (PCT)
Prior art keywords
obstruction
sense
antenna
radar
assembly
Prior art date
Application number
PCT/US2010/054775
Other languages
French (fr)
Other versions
WO2011056730A3 (en)
Inventor
Vinh N. Adams
Robert Adams
Wesley H. Dwelly
Original Assignee
Vawd Applied Science & Technology Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vawd Applied Science & Technology Corporation filed Critical Vawd Applied Science & Technology Corporation
Priority to GB1209474.4A (GB2488699B)
Publication of WO2011056730A2
Publication of WO2011056730A3
Priority to IL219519A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/04 - Display arrangements
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 - Combination of radar systems with cameras
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/887 - Radar or analogous systems specially adapted for specific applications for detection of concealed objects, e.g. contraband or weapons
    • G01S13/888 - Radar or analogous systems specially adapted for specific applications for detection of concealed objects, e.g. contraband or weapons through wall detection
    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01Q - ANTENNAS, i.e. RADIO AERIALS
    • H01Q1/00 - Details of, or arrangements associated with, antennas
    • H01Q1/12 - Supports; Mounting means
    • H01Q1/125 - Means for positioning
    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01Q - ANTENNAS, i.e. RADIO AERIALS
    • H01Q19/00 - Combinations of primary active antenna elements and units with secondary devices, e.g. with quasi-optical devices, for giving the antenna a desired directional characteristic
    • H01Q19/10 - Combinations of primary active antenna elements and units with secondary devices, e.g. with quasi-optical devices, for giving the antenna a desired directional characteristic using reflecting surfaces
    • H01Q19/12 - Combinations of primary active antenna elements and units with secondary devices, e.g. with quasi-optical devices, for giving the antenna a desired directional characteristic using reflecting surfaces wherein the surfaces are concave
    • H01Q19/13 - Combinations of primary active antenna elements and units with secondary devices, e.g. with quasi-optical devices, for giving the antenna a desired directional characteristic using reflecting surfaces wherein the surfaces are concave the primary radiating source being a single radiating element, e.g. a dipole, a slot, a waveguide termination
    • H01Q19/132 - Horn reflector antennas; Off-set feeding
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 - Systems of measurement based on relative movement of target
    • G01S13/52 - Discriminating between fixed and moving objects or between objects moving at different speeds
    • G01S13/522 - Discriminating between fixed and moving objects or between objects moving at different speeds using transmissions of interrupted pulse modulated waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 - Systems of measurement based on relative movement of target
    • G01S13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/581 - Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets
    • G01S13/582 - Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets adapted for simultaneous range and velocity measurements
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 - Combination of radar systems with lidar systems

Definitions

  • FIG. 8 illustrates an example Graphical User Interface (GUI) 800 configured to operate a sense through obstruction radar system such as the sense through obstruction radar system 100 shown in FIG. 1 and described above.
  • The GUI 800 may be implemented as a set of instructions (software) that can be hosted on a user interface computing device 422 or on another computing device that is connected to the user interface computing device 422 via a network as shown in FIG. 4.
  • The GUI 800 furnishes video display functionality to the operator or other users of the radar system.
  • The video data received from the electro-optical network camera 426 shown in FIG. 4 may be displayed in real-time 802. Several additional visual elements can be displayed over the real-time video.
  • The first visual overlay includes crosshairs 804 that represent the center of the camera's field of view (FOV) 612 and the center of the RF beam's FOV 608, by virtue of the alignment of the camera FOV 612 and the RF beam's FOV 608 as shown in FIG. 6.
  • The crosshairs 804 help the operator point the azimuth and elevation gimbal 412, shown in FIG. 4, so that the RF beam's FOV 608, shown in FIG. 6, is pointed at the obstruction of interest.
  • The second visual overlay includes a circle 806 that represents the RF beam's FOV 608, shown in FIG. 6.
  • The circle 806 helps the operator determine where it is possible for the radar to detect human micro-Doppler signatures.
  • The third visual overlay includes the ground/floor indicator 808 that helps the operator position the elevation angle of the azimuth and elevation gimbal 412, shown in FIG. 4, so that the center of the RF beam's FOV 608, shown in FIG. 6, is at an optimal height for detecting human micro-Doppler signatures.
  • Another visual overlay may include the track indicator(s) 810 that represent a valid track or detection of a micro-Doppler signature. Each valid track may have an associated track indicator 810 overlaid on the real-time video display 802. The track indicator(s) 810 can also display the range of the detection behind the obstruction, in this case the front wall of a building.
  • The track indicator(s) 810 can also be configured to show the range from the sense through obstruction radar system 100, shown in FIG. 1, instead of the range behind the obstruction.
  • The track indicator(s) 810 can be color-coded to indicate when the detection represents a moving or stationary micro-Doppler signature.
  • The number and types of tracks are shown in the Track Data area 812 of the GUI 800.
  • The local date and time are shown in the Local Date/Time area 814 of the GUI 800.
  • The sense through obstruction radar latitude, longitude, and heading information provided by the positioning system receiver 430, shown in FIG. 4, is displayed in the System Information area 816 of the GUI 800.
  • Also shown in the System Information area 816 are the range of the obstruction from the sense through obstruction radar, provided by the range finder 416 shown in FIG. 4, and the geographic coordinates (e.g., latitude and longitude) of the obstruction, computed by the radar computing device using the latitude, longitude, and/or heading of the radar along with the range to the obstruction (see the sketch following this list).
  • The Zoom Control area 818 of the GUI 800 may be used to control the zoom level of the electro-optical network camera 426 shown in FIG. 4.
  • The System Status area 820 of the GUI 800 may be used to monitor the health of all the communications interfaces.
  • The Gimbal Control area 822 of the GUI 800 may be used to monitor and control the azimuth and elevation gimbal 412, shown in FIG. 4.
  • The Display Control area 824 of the GUI 800 may be used to control the display of different visual elements or data on the display.
  • The Radar Start/Stop button 826 may be used to turn the radar assembly 404, shown in FIG. 4, on or off.
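The obstruction-coordinate computation referenced above can be sketched as follows. This is a minimal illustration, not part of the original disclosure: the flat-earth approximation, the function name, and the numeric values are assumptions; it simply projects the range-finder distance along the radar's heading from the radar's own latitude and longitude.

```python
# Minimal sketch (assumed method, not from the disclosure): project the range-finder
# distance along the gimbal heading to estimate the obstruction's latitude/longitude.
import math

EARTH_RADIUS_M = 6_371_000.0

def obstruction_lat_lon(radar_lat_deg, radar_lon_deg, heading_deg, range_m):
    """Latitude/longitude of the obstruction from the radar position, the heading
    (degrees clockwise from true north), and the range-finder distance (meters)."""
    north_m = range_m * math.cos(math.radians(heading_deg))
    east_m = range_m * math.sin(math.radians(heading_deg))
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(radar_lat_deg))))
    return radar_lat_deg + dlat, radar_lon_deg + dlon

# Example with assumed values: an obstruction 60 m due east of a radar at 33.0 N, -117.0 E.
print(obstruction_lat_lon(33.0, -117.0, 90.0, 60.0))
```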

Abstract

A standoff range, sense-through-obstruction radar system is capable of detecting micro-Doppler, or life form signatures, and movements through obstructions at stand-off ranges and displaying the target information over a live video feed of the area under surveillance. The sense-through-obstruction radar system comprises an antenna assembly that includes a horn antenna and a reflector configured to reflect radio frequency (RF) energy to/from the horn antenna. An antenna pointing assembly supports the antenna assembly. The antenna pointing assembly is configured to move the antenna assembly to point the antenna assembly toward an obstruction. A sensor assembly is mounted to the antenna assembly so that the sensor assembly is aligned with the RF beam formed from the RF energy reflected from the reflector to the horn antenna. The sensor assembly is configured to detect the location of the obstruction and to provide information to assist pointing of the antenna assembly toward the obstruction.

Description

STANDOFF RANGE SENSE THROUGH OBSTRUCTION RADAR SYSTEM
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application Serial No. 61/257,469, filed November 2, 2009, which is herein incorporated by reference in its entirety.
BACKGROUND
[0002] Sense through obstruction radar systems allow users to gain actionable intelligence through obstructions such as building walls, fences, and foliage. These radars may be used by the military, police, security, and firemen to provide a capability of detecting, locating, identifying, and classifying moving and stationary humans for rescue and clearing operations. Sense through obstruction radars include a transmitter that transmits electromagnetic waves that are reflected by objects and are then detected by the radar's receiver. The transmitted waves interact with objects, and these interactions change the properties of the returned waves. When an object is moving at a constant velocity, the returned wave is shifted in frequency, which is called the Doppler effect. The larger the velocity, the larger the frequency shift. When the object is moving towards the radar, the frequency of the returned wave is increased. Conversely, when the object is moving away from the radar, the frequency of the returned wave is decreased. When the target is not moving but is vibrating, the returned signal exhibits frequency sidebands called micro-Doppler. Because electromagnetic waves travel roughly at the speed of light, the round-trip time from the radar to the target provides information on the range of the target. Depending on the material, some portion of the electromagnetic waves penetrates through obstructions such as walls, but the amplitude of the waves is attenuated. For a given material, the lower the frequency of the wave, the less attenuation the electromagnetic wave exhibits. As the frequency of the electromagnetic waves decreases, however, the difficulty of measuring micro-Doppler from human or animal life forms increases. Sense through obstruction radars therefore usually operate in the frequency range of 300 MHz to 8 GHz, which allows the electromagnetic waves to penetrate obstructions while still permitting measurement of the Doppler and micro-Doppler effects of human or animal life-forms. However, detecting a human or animal life-form behind an obstruction is difficult because the transmitted and reflected waves are both attenuated by the obstruction. This makes detecting the Doppler and micro-Doppler due to the human or animal life-form difficult, especially in the presence of the noise that is inherent in a radar. Another difficulty encountered when detecting slow-moving and vibrating objects is that the frequency shifts and the signals are extremely small, making it difficult to detect these shifts in the presence of the stationary objects in the radar's field of view, especially in the presence of noise.
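For illustration, the Doppler and range relationships described above can be written out in a minimal sketch; this is the editor's example rather than part of the disclosure, and the carrier frequency, target velocity, and round-trip time used below are assumed values.

```python
# Minimal sketch of the relationships described in the background: the two-way Doppler
# shift of a monostatic radar and range recovered from round-trip time. Values assumed.

C = 299_792_458.0  # speed of light (m/s)

def doppler_shift_hz(radial_velocity_m_s: float, carrier_hz: float) -> float:
    """Two-way Doppler shift; positive when the target moves toward the radar."""
    return 2.0 * radial_velocity_m_s * carrier_hz / C

def range_from_round_trip_m(round_trip_s: float) -> float:
    """Target range from the measured round-trip time of the transmitted wave."""
    return C * round_trip_s / 2.0

if __name__ == "__main__":
    # A walking person (~1 m/s) seen by a 3 GHz radar (assumed values): about 20 Hz.
    print(f"Doppler shift: {doppler_shift_hz(1.0, 3e9):.1f} Hz")
    # A 1 microsecond round trip corresponds to roughly 150 m of range.
    print(f"Range: {range_from_round_trip_m(1e-6):.1f} m")
```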
SUMMARY
[0003] A standoff range, sense-through-obstruction radar system that is capable of detecting micro-Doppler, or life form, signatures and movements through obstructions at stand-off ranges is disclosed, together with a method of displaying the target information over a live video feed of the area under surveillance. In an implementation, the sense-through-obstruction radar system comprises an antenna assembly that includes a horn antenna and a reflector configured to reflect radio frequency (RF) energy to/from the horn antenna. An antenna pointing assembly supports the antenna assembly. The antenna pointing assembly is configured to move the antenna assembly to point the antenna assembly toward an obstruction. A sensor assembly is mounted to the antenna assembly so that the sensor assembly is aligned with the RF beam formed from the RF energy reflected from the reflector to the horn antenna. The sensor assembly (e.g., a range finder and an electro-optical camera) is configured to detect the location of the obstruction and to provide information to assist pointing of the antenna assembly toward the obstruction by the antenna pointing assembly. The radar system may include a radar computing device configured to direct movement of the antenna assembly by the antenna pointing assembly in response to the detection of the location of the obstruction by the sensor assembly.
[0004] During operation of the sense-through-obstruction radar system, range information is transmitted to the radar computing device, which is operable to utilize the range information to configure timing of transmit-receive cycles associated with the sense-through-obstruction radar system. Track data corresponding to a filtered range/range-rate pair is also received from the radar computing device. At least one track box may be superimposed over a real-time image that represents a field of view of the sense-through-obstruction radar system. The track box corresponds to the track data and represents a target detected by the sense-through-obstruction radar system.
[0005] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
DRAWINGS
[0006] The detailed description is described with reference to the accompanying figures. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
[0007] FIG. 1 is a block diagram illustrating an example implementation of a standoff range sense through obstruction radar that is capable of detecting micro-Doppler, or life form, signatures and movements through obstructions at stand-off ranges, together with a method of displaying the target information over a live video feed of the area under surveillance.
[0008] FIG. 2 is a flow chart illustrating an example setup procedure for a sense through obstruction radar system shown in FIG. 1.
[0009] FIG. 3 is a flow chart illustrating an example radar data processing cycle for a sense through obstruction radar system shown in FIG. 1.
[0010] FIG. 4 is a block diagram illustrating an example implementation of the sense through obstruction radar shown in FIG. 1.
[0011] FIG. 5 is a perspective view depicting an example implementation of the sense through obstruction radar system shown in FIGS. 1 and 2, further illustrating the radar antenna, sensor assembly and azimuth and elevation gimbal used for pointing the antenna.
[0012] FIG. 6 is a diagrammatic pictorial representation illustrating the alignment of the sensor assembly and the radar's RF beam for the sense through obstruction radar system shown in FIG. 1, wherein the sensor assembly is comprised of a range finder and an electro-optical camera, and the alignment of the range finder beam, the electro-optical camera's field of view (FOV), and the radar's RF beam is shown.
[0013] FIG. 7 is a flow chart illustrating an example radar data processing cycle for the sense through obstruction radar system shown in FIG. 1.
[0014] FIG. 8 is a diagrammatic pictorial representation illustrating a graphical user interface (GUI) suitable to operate the sense through obstruction radar system shown in FIG. 1.
DETAILED DESCRIPTION
Overview
[0015] Stand-off range sense through obstruction radars furnish enhanced capability to detect moving and stationary micro-Doppler, or life-form, signatures for rescue and clearing operations. Typical obstructions include walls of buildings, foliage, and so forth, but could be any type of obstruction except for solid metal obstructions. Stand-off range sense through obstruction radars can be used by the military, police, security, and firemen. Additionally, the radars can provide standoff range human biometric monitoring for medical personnel to help save lives (e.g., battlefield wounded). It is also desirable that these radars be able to detect very low velocity motion and small motion (also known as micro-Doppler), as exhibited by life-forms, in the presence of all the stationary objects, or clutter, that are in the radar's field of view (FOV) and range of interest. It is also desirable that these radars be capable of operating at stand-off ranges of at least twenty (20) meters, either as a requirement of the application or to provide safety or stealth to the operators. Functionally, to be useful to the military, police, security, firemen, and medical personnel, it is desirable that these radars be easy to set up and operate and that they present the target information to the operator in an easy-to-understand format.
[0016] Accordingly, a standoff range, sense-through-obstruction radar system that is capable of detecting micro-Doppler, or life form, signatures and movements through obstructions at stand-off ranges is disclosed, together with a method of displaying the target information over a live video feed of the area under surveillance. In an implementation, the sense-through-obstruction radar system comprises an antenna assembly that includes a horn antenna and a reflector configured to reflect radio frequency (RF) energy to/from the horn antenna. This horn and reflector pair constitutes a high gain antenna assembly that provides sufficient gain to enable the system to operate at stand-off ranges. The high gain antenna is mounted to an antenna pointing assembly that is configured to point the antenna assembly towards the obstruction of interest. A sensor assembly, which may be comprised of an electro-optical camera, a range finder, and so on, is mounted to the antenna assembly so that the sensor assembly is aligned with the RF beam formed from the RF energy from the horn antenna that is reflected from the reflector. The sensor assembly is configured to provide information to assist pointing of the antenna assembly toward the obstruction by the antenna pointing assembly.
[0017] A radar computing device, such as a computer, laptop computer, tablet computer, and so on, is provided with a graphical user interface (GUI) that is configured to simplify the setup and operation of the sense through obstruction radar. The GUI provides this functionality through user interface elements tied to the antenna pointing device and to the outputs of the range finder and electro-optical camera, and it displays radar data in an easy-to-understand format. During operation of the sense-through-obstruction radar system, range information is transmitted to the radar computing device, which can utilize the range information to control the timing of transmit-receive cycles associated with the sense-through-obstruction radar system to keep the radar range of interest centered on a target of interest. Additionally, track data corresponding to a filtered range/range-rate pair associated with a target is also received from the radar computing device. At least one track box may be superimposed over a real-time image that contains the FOV of the sense-through-obstruction radar system. This real-time image and the track box are displayed in the GUI. The track box corresponds to the track data and represents a target detected by the sense-through-obstruction radar system.
Example Implementations
[0018] FIG. 1 illustrates an example implementation of a sense through obstruction radar system 100. The system 100 is comprised of a radar computing device 102 that is connected to a radar assembly 104 through one or more communications cables 106. The radar computing device 102 can be a computer, a laptop computer, a tablet computer and so on that is comprised of at least a processor, memory, a display device, and an input device. The processor provides processing functionality and may execute one or more software programs which implement techniques described herein and may access the memory to store and retrieve data.
[0019] The memory is tangible computer-readable media that provides storage functionality to store and retrieve various data associated with the operation of the radar computing device, such as the software program, code segments and other types of digital data. The memory may include, for example, removable and non-removable memory elements such as RAM, ROM, Flash (e.g., SD Card, mini-SD card, micro-SD Card), magnetic, optical, USB memory devices, and so forth.
[0020] The display device provides visual display functionality for the radar computing device and may comprise an LCD (Liquid Crystal Display), a TFT (Thin Film Transistor) LCD display, an LEP (Light Emitting Polymer) or PLED (Polymer Light Emitting Diode) display, and so forth, configured to display text and/or graphical information such as a graphical user interface. The display may be backlit via a backlight such that it may be viewed in the dark or other low-light environments.
[0021] The input device allows the operator to operate the radar computing device and may be comprised of a keyboard, and/or a pointing device such as a mouse, trackball or touch screen such as a capacitive touch screen, a resistive touch screen, an infrared touch screen, combinations thereof, and the like.
[0022] The radar assembly 104 includes, but is not necessarily limited to: a transmitter such as a microwave power amplifier, a modulator such as a microwave switch or phase shifter, a receiver such as a low-noise microwave amplifier, frequency down converter, and analog to digital converter, and a frequency source(s) such as a voltage controlled oscillator(s) or a frequency synthesizer(s). An example radar assembly 104 is described in U.S. Patent No. 7,817,082, issued October 19, 2010, which is herein incorporated by reference in its entirety. The communications cable(s) 106 may be any standard communication cable used to connect computing devices to peripherals, such as serial, parallel, USB, Ethernet, IEEE 1394, PCI Express, and so on.
[0023] The radar assembly 104 is connected to a radar antenna 108 through one or more radio frequency (RF) cables 110. The radar antenna 108 can be any type of high gain antenna such as horn antenna(s), parabolic dish antenna(s), flat panel antenna(s), and so on. The RF cable(s) 110 can be any type of low loss microwave coaxial cable such as RG-58A, RG-223, SR-085, SR-141, and so on.
[0024] The radar computing device 102 is also connected to an antenna pointing assembly 112 through the communications cable(s) 106. The antenna pointing assembly 112 can be any type of gimbal, either electric or hydraulic, that allows the antenna to be pointed at the obstruction of interest. The radar antenna 108 is mounted to the antenna pointing assembly 112 by mounting hardware 114 such as brackets, nuts and bolts, and so on. The radar computing device 102 is also connected to a sensor assembly 116 through communications cable(s) 106. The sensor assembly 116 is comprised of at least a range finder, either optical or RF, and an electro-optical camera such as a visible light camera, a low-light capable visible light camera, an IR camera, and so on.
[0025] FIG. 2 illustrates an example setup procedure 200 for a sense through obstruction radar system such as the sense through obstruction radar system 100 shown in FIG. 1. As shown, the process is initiated by powering up the radar system 202. The operator then points the radar antenna 204 using the radar computing device 206. Images from the sensor assembly camera 208 are used to determine that the radar antenna is properly pointed 210. Once the radar antenna is properly pointed 210, the radar is started 212. The radar range of interest 214 may then be adjusted using the radar computing device 206 and feedback from the sensor assembly range finder 216. The range of the radar is set by adjusting the time between the transmitted RF energy and the start of the receiver RF energy measurement. This time is calculated by taking the range provided by the sensor assembly range finder 216, multiplying it by two (round-trip distance), and dividing by the speed of light. When the radar range is correct 218, the radar may collect and process radar data 220.
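A minimal sketch of the timing calculation described in this setup step follows; the function name and the example range are the editor's assumptions, but the arithmetic is exactly as stated above (double the range-finder distance, divide by the speed of light).

```python
# Sketch (assumed helper, not from the disclosure): receive-gate delay derived from
# the range reported by the sensor assembly range finder.

C = 299_792_458.0  # speed of light (m/s)

def receive_gate_delay_s(range_finder_m: float) -> float:
    """Time between the transmitted pulse and the start of the receive measurement."""
    return 2.0 * range_finder_m / C  # out-and-back distance divided by c

if __name__ == "__main__":
    # Obstruction reported at 60 m (assumed value): the delay is roughly 400 ns.
    print(f"{receive_gate_delay_s(60.0) * 1e9:.1f} ns")
```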
[0026] FIG. 3 illustrates an example radar data processing cycle 300. The cycle (process) 300 is initiated when the radar assembly 302 transmits a radio frequency (RF) pulse 304. The radar assembly 302 then receives the reflected RF energy 306 and performs analog processing 308 such as filtering, frequency down conversion, gating, and so on, on the received signal. The processed analog signal is then converted to a digital signal using an analog to digital (A/D) converter 310 which is connected to the radar computing device 312. The radar computing device 312 then processes the digital signal using digital signal processing (DSP) techniques 314 such as filtering, frequency down conversion, spectral analysis, and so on. The processed digital signal is then sent to a target detector 316 to determine how many targets are detected, their range, and range rates. The range and range rate detection information is sent to the radar computing device's display 318 for display to the operator.
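The digital processing in this cycle is characterized only in general terms (filtering, frequency down conversion, spectral analysis). As one common way such a stage is realized, the following sketch applies a windowed FFT across pulses for each range gate and thresholds the resulting Doppler spectra; the data layout, function names, zero-Doppler suppression, and fixed threshold are the editor's assumptions, not details taken from the disclosure.

```python
# Sketch of FFT-based Doppler processing for gated radar samples (editor's example).
import numpy as np

def doppler_spectrum(slow_time_samples: np.ndarray) -> np.ndarray:
    """Windowed FFT magnitude across the pulse (slow-time) dimension for one range gate."""
    window = np.hanning(len(slow_time_samples))
    return np.abs(np.fft.fftshift(np.fft.fft(slow_time_samples * window)))

def detect_targets(gated: np.ndarray, prf_hz: float, threshold: float):
    """Return (gate index, Doppler frequency in Hz, amplitude) for bins above threshold.

    gated -- complex baseband samples with shape (num_range_gates, num_pulses)
    """
    num_gates, num_pulses = gated.shape
    freqs = np.fft.fftshift(np.fft.fftfreq(num_pulses, d=1.0 / prf_hz))
    detections = []
    for gate in range(num_gates):
        spectrum = doppler_spectrum(gated[gate])
        spectrum[num_pulses // 2] = 0.0  # crude zero-Doppler (stationary clutter) suppression
        for k in np.flatnonzero(spectrum > threshold):
            detections.append((gate, float(freqs[k]), float(spectrum[k])))
    return detections
```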
[0027] FIG. 4 illustrates an example implementation of the sense through obstruction radar system shown in FIG. 1. The sense through obstruction radar system 400 shown is comprised of a radar computing device 402 that is connected to a radar assembly 404 through one or more communications cables 406. The radar computing device 402 sets RF parameters for the radar assembly 404 to provide an RF signal to a radar antenna 408 via one or more RF cables 410. The radar computing device 402 is also connected to an azimuth and elevation gimbal 412 through communications cable(s) 406. The azimuth and elevation gimbal 412 receives antenna pointing commands from the radar computing device 402 via the communication cable(s) 406. The radar antenna 408 is mounted to the azimuth and elevation gimbal 412 using mounting hardware 414. The radar computing device 402 is also connected to a range finder 416 through communications cable(s) 406. The radar computing device 402 is also connected to a wireless router 418 using one or more network cables 420. A user interface computing device 422 that hosts the user interface used by the operator is connected to the wireless router 418 through a network link 424. The network link 424 can be a network cable(s) or a wireless link such as 802.11 Wi-Fi, Bluetooth, ZigBee, and so on. An electro-optical network camera 426 is connected to the wireless router 418 so that it can provide streaming video to the user interface computing device 422 or the radar computing device 402. A sensor assembly 428 is mounted on the azimuth and elevation gimbal 412 using mounting hardware 414. In the implementation illustrated, the sensor assembly 428 is comprised of a range finder 416 and an electro-optical network camera 426. In other implementations, it is contemplated that the sensor assembly 428 may include various other types of sensors/sensing equipment. The sensor assembly 428 is aligned with the RF beam of the radar antenna 408 so that the operator can position the center of the RF beam based on the center of the image provided by the electro-optical network camera 426, using the azimuth and elevation gimbal 412 and the radar computing device 402. A positioning system receiver 430 is connected to the wireless router 418 through the network link 424. The positioning system receiver 430 can be a Global Positioning System (GPS) receiver, a GLONASS receiver, a COMPASS receiver, a GALILEO receiver, a cell tower triangulation receiver, and so on. The positioning system receiver 430 provides latitude, longitude, and altitude information to the user interface computing device 422. The wireless router 418 provides wireless links 432 to remote viewing device(s) 434 so that the information presented on the display of the user interface computing device 422 can be displayed on the remote viewing device(s) 434. The remote viewing device(s) 434 can be a computer(s), a laptop computer(s), a tablet computer(s), a hand-held computer(s) such as an IPOD brand handheld computer, a smart phone(s) such as an IPHONE brand smart phone, a BLACKBERRY brand smart phone, or an ANDROID based smart phone, and so on, or any combination thereof. Power is provided to the electrical components from a suitable power source such as a battery (e.g., a 24V battery (not shown)) or the like.
The system includes the software hosted on the radar computing device 402 for controlling the azimuth and elevation gimbal 412, the radar control and signal processing software hosted on the radar computing device 402, and the user interface and data display software hosted on the user interface computing device 422.
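As an illustration of how the radar computing device 402 might relay an antenna pointing command to the azimuth and elevation gimbal 412 over the communications cable(s) 406, the sketch below serializes a simple command structure. The field names, units, and JSON encoding are assumptions for illustration only; the disclosure does not specify a command format.

```python
# Hypothetical sketch of an antenna-pointing command as it might be sent from
# the radar computing device 402 to the azimuth/elevation gimbal 412.
# The message format is an assumption, not part of the disclosure.
import json
from dataclasses import dataclass, asdict

@dataclass
class PointingCommand:
    azimuth_deg: float     # commanded azimuth angle
    elevation_deg: float   # commanded elevation angle
    slew_rate_dps: float   # commanded slew rate, degrees per second

def encode_command(cmd: PointingCommand) -> bytes:
    """Serialize a pointing command for transmission over the communications cable."""
    return (json.dumps(asdict(cmd)) + "\n").encode("ascii")

# Example: centre the RF beam two degrees right and one degree up.
print(encode_command(PointingCommand(azimuth_deg=2.0, elevation_deg=1.0, slew_rate_dps=5.0)))
```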
[0028] FIG. 5 depicts an example radar antenna assembly 500 of the sense through obstruction radar system 100 shown in FIG. 1. The radar antenna assembly 500 includes a horn antenna 502 that is mounted to a parabolic dish reflector 504 in an offset feed configuration using support arms 506. The horn antenna 502 and the parabolic dish reflector 504 constitute a high-gain radio frequency (RF) antenna. A range finder 508 and an electro-optical camera 510 are mounted on the support arms 506 on either side of the horn antenna 502 so that they are aligned with the RF beam that is formed from the RF energy that is reflected from the parabolic dish reflector 504. This alignment of the center of the RF beam, the electro-optical camera's FOV, and the range finder beam is illustrated in FIG. 6. This subassembly is mounted to an electro-mechanical azimuth and elevation antenna pointing device 512.
[0029] FIG. 6 depicts an example radar antenna assembly 600 of the sense through obstruction radar system 100 shown in FIG. 1. The RF beam 602 is formed from the RF energy that is transmitted from the horn antenna 604 and reflected off of the parabolic dish reflector 606, and has a field of view (FOV) 608. An electro-optical camera 610 is mounted next to the horn antenna 604 such that the camera FOV 612 has its center aligned with the center of the RF beam FOV 608. This center point is indicated by cross-hairs 614. A range finder 616 is mounted next to the horn antenna 604 such that the center of the range finder beam 618 is incident upon the center of the RF beam FOV 608 and the center of the camera's FOV 612.
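The following back-of-the-envelope sketch (not part of the disclosure) illustrates why mounting the electro-optical camera 610 and range finder 616 a small lateral distance from the horn antenna 604 still keeps their boresights effectively coincident with the center of the RF beam FOV 608 at standoff ranges: the parallax angle shrinks with range. The 0.15 m mounting offset is an assumed value.

```python
# Illustrative parallax check for the boresight alignment in FIG. 6.
# A sensor offset laterally from the RF axis sees the RF beam centre with a
# parallax angle that becomes negligible at standoff ranges.
import math

def parallax_deg(offset_m: float, range_m: float) -> float:
    """Angular offset between the sensor axis and the RF beam centre at a given range."""
    return math.degrees(math.atan2(offset_m, range_m))

for rng in (10, 50, 200):
    print(f"range {rng:4d} m -> parallax {parallax_deg(0.15, rng):.3f} deg")
```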
[0030] FIG. 7 illustrates a radar data processing cycle 700 for the sense through obstruction radar shown in FIG. 1. The cycle (process) 700 starts when the radar assembly 702 transmits a radio frequency (RF) pulse 704. The radar assembly 702 then receives the reflected RF energy 706 and performs analog processing 708 on the received signal. The processed analog signal is then converted to a digital signal using an analog-to-digital (A/D) converter 710 that is connected to a radar computing device 712. The radar computing device 712 then converts the digital signal from the time domain to the frequency domain using digital signal processing (DSP) techniques 714 such as a discrete Fourier transform, a wavelet transform, and so on. Additional DSP techniques, such as filtering, are used to suppress the clutter in the signal 716. The processed digital signal is then sent to a detector 718 that correlates the signal with a plurality of spectral templates 720 to determine how many targets are detected, their ranges, and their range rates. The range and range-rate detection information is sent to a tracker 722. The tracker 722 creates, destroys, and updates tracks (filtered range/range-rate pairs) based on whether the received range/range-rate data is associated with an existing track or represents a new track. When an existing track does not receive an update from the tracker within a set period of time, the track is eliminated. In one or more implementations, the tracker 722 may employ nearest-neighbor logic to associate new data with existing tracks and a Kalman filter to update and propagate the tracks. However, it is contemplated that other techniques may be employed to achieve similar results. Valid track data is furnished to a computing device display 724, which could be the display connected to the radar computing device 712, or the display connected to another computing device that is connected to the radar computing device 712 using a network as shown in FIG. 4. The radar computing device 712, which includes or is connected to the display 724, uses the range-rate information to determine whether the micro-Doppler signature represents a moving or stationary object, and color codes the displayed track information accordingly.
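A minimal sketch of the kind of track update the tracker 722 could perform is shown below, assuming a constant-velocity state model over range and range-rate and the Kalman filter mentioned above. The update interval, noise covariances, and initial state are illustrative assumptions, not values from the disclosure.

```python
# A minimal constant-velocity Kalman-filter update for one range/range-rate
# track, as one way the tracker 722 might propagate and refine its tracks.
# Matrix values and noise levels are illustrative only.
import numpy as np

DT = 0.1                                   # assumed update interval, s
F = np.array([[1.0, DT], [0.0, 1.0]])      # state transition (range, range-rate)
H = np.eye(2)                              # both range and range-rate are measured
Q = np.diag([0.01, 0.1])                   # assumed process noise
R = np.diag([0.5, 0.2])                    # assumed measurement noise

def kalman_step(x, P, z):
    """Propagate the track one interval and fold in the new range/range-rate measurement z."""
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.array([25.0, 0.0])                  # initial track: 25 m behind the wall, stationary
P = np.eye(2)
for z in ([24.8, -0.4], [24.7, -0.5], [24.6, -0.6]):   # detections from the detector 718
    x, P = kalman_step(x, P, np.array(z))
print(f"track range {x[0]:.2f} m, range-rate {x[1]:.2f} m/s")
```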
[0031] FIG. 8 illustrates an example Graphical User Interface (GUI) 800 configured to operate a sense through obstruction radar system such as the sense through obstruction radar system 100 shown in FIG. 1 and described above. The GUI 800 may be implemented as a set of instructions (software) that can be hosted on a user interface computing device 422 or on another computing device that is connected to the user interface computing device 422 via a network as shown in FIG. 4. The GUI 800 furnishes video display functionality to the operator or other users of the radar system. In implementations, the video data received from the electro-optical camera 426 shown in FIG. 4 may be presented as a real-time video display 802. Several additional visual elements can be displayed over the real-time video. For example, the following list is representative of visual elements that can provide additional information to the user but does not include all possible useful visual elements. The first visual overlay includes crosshairs 804 that represent the center of the camera's field of view (FOV) 612 and the center of the RF beam's FOV 608 by virtue of the alignment of the camera FOV 612 and the RF beam FOV 608 as shown in FIG. 6. The crosshairs 804 help the operator point the azimuth and elevation gimbal 412, shown in FIG. 4, so that the RF beam's FOV 608, shown in FIG. 6, is pointed at the obstruction of interest. The second visual overlay includes a circle 806 that represents the RF beam's FOV 608, shown in FIG. 6. The circle 806 helps the operator determine where it is possible for the radar to detect human micro-Doppler signatures. The third visual overlay includes the ground/floor indicator 808 that helps the operator position the elevation angle of the azimuth and elevation gimbal 412, shown in FIG. 4, so that the center of the RF beam's FOV 608, shown in FIG. 6, is at an optimal height for detecting human micro-Doppler signatures. Another visual overlay may include the track indicator(s) 810 that represent a valid track or detection of a micro-Doppler signature. Each valid track may have an associated track indicator 810 overlaid on the real-time video display 802. The track indicator(s) 810 can also display the range of the detection behind the obstruction, in this case the front wall of a building. This value can change when updated valid track information is received from the radar computing device 712 shown in FIG. 7. The track indicator(s) 810 can also be configured to show the range from the sense through obstruction radar system 100, shown in FIG. 1, instead of the range behind the obstruction. The track indicator(s) 810 can be color coded to indicate whether the detection represents a moving or stationary micro-Doppler signature. The number and types of tracks are shown in the Track Data area 812 of the GUI 800. The local date and time are shown in the Local Date/Time area 814 of the GUI 800. The sense through obstruction radar latitude, longitude, and heading information provided by the positioning system receiver 430, shown in FIG. 4, are displayed in the System Information area 816 of the GUI 800. Also shown in the System Information area 816 are the range of the obstruction from the sense through obstruction radar provided by the range finder 416 shown in FIG. 4, and the geographic coordinates (e.g., latitude and longitude) of the obstruction computed by the radar computing device 712 using the latitude, longitude, and/or heading of the radar along with the range to the obstruction.
The Zoom Control area 818 of the GUI 800 may be used to control the zoom level of the electro-optical camera 426 shown in FIG. 4. The System Status area 820 of the GUI 800 may be used to monitor the health of all of the communications interfaces. The Gimbal Control area 822 of the GUI 800 may be used to monitor and control the azimuth and elevation gimbal 412, shown in FIG. 4. The Display Control area 824 of the GUI 800 may be used to control the display of different visual elements or data on the display. The Radar Start/Stop button 826 may be used to turn the radar assembly 404, shown in FIG. 4, on or off.
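The sketch below illustrates two of the display computations described for the GUI 800: color coding a track indicator 810 by its range-rate (using the hue assignment recited in claims 12 and 15, with an assumed 0.1 m/s motion threshold) and computing the obstruction's geographic coordinates from the radar's latitude, longitude, heading, and the range reported by the range finder 416. The flat-earth projection is a simplifying assumption.

```python
# Illustrative sketch of two GUI computations from paragraph [0031].
# The motion threshold and flat-earth projection are assumptions.
import math

EARTH_RADIUS_M = 6_371_000.0

def track_colour(range_rate_mps: float, threshold_mps: float = 0.1) -> str:
    """Hue assignment per claims 12 and 15: red when the micro-Doppler signature
    is stationary, green when it is in motion (assumed 0.1 m/s threshold)."""
    return "green" if abs(range_rate_mps) > threshold_mps else "red"

def obstruction_latlon(lat_deg, lon_deg, heading_deg, range_m):
    """Project the obstruction position a short distance along the radar heading."""
    dlat = range_m * math.cos(math.radians(heading_deg)) / EARTH_RADIUS_M
    dlon = (range_m * math.sin(math.radians(heading_deg))
            / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

print(track_colour(-0.6))                                   # moving target -> green
print(obstruction_latlon(33.4484, -112.0740, 45.0, 75.0))   # obstruction ~75 m NE of the radar
```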
Conclusion
[0032] Although the subject matter has been described in language specific to structural features and/or process operations, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims

CLAIMS
What is claimed is:
1. A sense-through-obstruction radar system comprising:
an antenna assembly including a reflector and a horn antenna, the reflector configured to reflect RF energy to the horn antenna;
an antenna pointing assembly supporting the antenna assembly, the antenna pointing assembly configured to move the antenna assembly to point the antenna assembly toward an obstruction; and
a sensor assembly mounted to the antenna assembly so that the sensor assembly is aligned with an RF beam formed from the RF energy reflected from the reflector to the horn antenna, the sensor assembly configured to detect the location of the obstruction to direct pointing of the antenna assembly toward the obstruction by the antenna pointing assembly.
2. The sense-through-obstruction radar system as recited in claim 1, further comprising a radar computing device configured to direct movement of the antenna assembly by the antenna pointing assembly responsive to detection of the location of the obstruction by the sensor assembly.
3. The sense-through-obstruction radar system as recited in claim 2, wherein the sensor assembly comprises a camera, the camera configured to capture an image of the obstruction.
4. The sense-through-obstruction radar system as recited in claim 3, wherein the camera comprises an electro-optical camera configured to capture at least one of a photographic image or video of the obstruction.
5. The sense-through-obstruction radar system as recited in claim 2, wherein the camera comprises an infrared camera configured to capture an infrared image of the obstruction.
6. The sense-through-obstruction radar system as recited in claim 5, wherein the sensor assembly comprises a range finder, the range finder configured to detect a range from the antenna assembly to the obstruction.
7. The sense-through-obstruction radar system as recited in claim 6, wherein the camera and the range finder are mounted to the antenna assembly adjacent to the horn antenna.
8. The sense-through-obstruction radar system as recited in claim 7, wherein the antenna assembly further comprises at least one support arm configured to support the reflector and the horn antenna in an offset feed arrangement so that the horn antenna is positioned to receive RF energy reflected from the reflector, the support arm supporting the camera and range finder.
9. The sense-through-obstruction radar system as recited in claim 2, wherein the radar director assembly comprises a range finder, the range finder configured to detect a range from the antenna assembly to the obstruction.
10. The sense-through-obstruction radar system as recited in claim 2, wherein the antenna pointing assembly comprises a gimbal configured to control the azimuth and elevation of the antenna assembly.
11. A method for operating a sense-through-obstruction radar system, comprising:
transmitting range information to a radar computing device, the radar computing device operable to utilize the range information to configure a timing of a transmit-receive cycle associated with the sense-through-obstruction radar system;
receiving track data corresponding to a filtered range/range-rate pair from the radar computing device; and
causing at least one track box to be superimposed over a real-time image that represents a field of view of the sense-through-obstruction radar system, the at least one track box corresponding to the track data and representing a target detected by the sense-through-obstruction radar system.
12. The method as recited in claim 11, wherein the causing at least one track box to be superimposed over a real-time image comprises causing at least one track box to have a first hue when the target is stationary and a second hue when the target is in motion.
13. The method as recited in claim 11, wherein the causing at least one track box to be superimposed over a real-time image comprises causing at least one track box to be represented as a human avatar.
14. The method as recited in claim 13, wherein the causing at least one track box to be superimposed over a real-time image further comprises causing at least one track box to be represented as a three-dimensional human avatar.
15. The method as recited in claim 12, wherein the first hue comprises red and the second hue comprises green.
16. A sense-through-obstruction radar system comprising:
a radar operable to furnish sense-through-obstruction target detection;
a mobile computing device communicatively coupled to the radar, the mobile computing device further including:
a display device;
a memory operable to store one or more modules; and
a processing system operable to execute the one or more modules to:
transmit range information to a radar computing device that is operable to utilize the range information to configure a timing of a transmit-receive cycle associated with the sense-through-obstruction radar system;
receive track data corresponding to a filtered range/range-rate pair from the radar computing device; and
cause at least one track box to be superimposed over a real-time image displayed by the display device, the real-time image representing a field of view of the radar, the at least one track box corresponding to the track data and representing a target detected by the radar.
17. The sense-through-obstruction radar system as recited in claim 16, further comprising a display module operable to cause the at least one track box to have a first hue when the target is stationary and to have a second hue when the target is in motion.
18. The sense-through-obstruction radar system as recited in claim 16, wherein the module is configured to cause the at least one track box to be represented as a human avatar.
19. The sense-through-obstruction radar system as recited in claim 18, wherein the module is further configured to cause the at least one track box to be represented as a three-dimensional human avatar.
20. The sense-through-obstruction radar system as recited in claim 16, wherein the first hue comprises red and the second hue comprises green.
PCT/US2010/054775 2009-11-03 2010-10-29 Standoff range sense through obstruction radar system WO2011056730A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1209474.4A GB2488699B (en) 2009-11-03 2010-10-29 Standoff range sense through obstruction radar system
IL219519A IL219519A (en) 2009-11-03 2012-05-01 Standoff range sense-through-obstruction radar system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US25746909P 2009-11-03 2009-11-03
US61/257,469 2009-11-03
US12/916,008 US20110102234A1 (en) 2009-11-03 2010-10-29 Standoff range sense through obstruction radar system
US12/916,008 2010-10-29

Publications (2)

Publication Number Publication Date
WO2011056730A2 true WO2011056730A2 (en) 2011-05-12
WO2011056730A3 WO2011056730A3 (en) 2011-08-04

Family

ID=43924834

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/054775 WO2011056730A2 (en) 2009-11-03 2010-10-29 Standoff range sense through obstruction radar system

Country Status (5)

Country Link
US (2) US20110102234A1 (en)
GB (1) GB2488699B (en)
IL (1) IL219519A (en)
TR (1) TR201205754T1 (en)
WO (1) WO2011056730A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110618465A (en) * 2018-06-04 2019-12-27 富士通株式会社 Article detection method and apparatus

Families Citing this family (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL192601A (en) * 2008-07-03 2014-07-31 Elta Systems Ltd Sensing/emitting apparatus, system and method
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
GB2488699B (en) * 2009-11-03 2014-09-17 Vawd Applied Science & Technology Corp Standoff range sense through obstruction radar system
US8593329B2 (en) * 2010-03-17 2013-11-26 Tialinx, Inc. Hand-held see-through-the-wall imaging and unexploded ordnance (UXO) detection system
US20120154239A1 (en) * 2010-12-15 2012-06-21 Bridgewave Communications, Inc. Millimeter wave radio assembly with a compact antenna
US9696421B2 (en) 2013-03-15 2017-07-04 Src, Inc. Through wall sensing system using WiFi/cellular radar
CN103245976B (en) * 2013-05-23 2016-01-20 中国人民解放军第四军医大学 Based on human body target and the surrounding environment structure compatible detection method of UWB bioradar
WO2015084917A1 (en) * 2013-12-03 2015-06-11 Edh Us Llc Antenna with boresight optical system
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US20150228119A1 (en) 2014-02-11 2015-08-13 Osterhout Group, Inc. Spatial location presentation in head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
JP6428270B2 (en) * 2014-02-10 2018-11-28 株式会社デンソー Axis deviation detector
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
CN106537179B (en) * 2014-06-30 2019-04-19 博迪戴特股份有限公司 For determining the hand-held multisensor syste of the size of irregular object
US9720072B2 (en) * 2014-08-28 2017-08-01 Waymo Llc Methods and systems for vehicle radar coordination and interference reduction
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US10571591B2 (en) 2016-04-28 2020-02-25 Fluke Corporation RF in-wall image registration using optically-sensed markers
US10585203B2 (en) 2016-04-28 2020-03-10 Fluke Corporation RF in-wall image visualization
US10254398B2 (en) 2016-04-28 2019-04-09 Fluke Corporation Manipulation of 3-D RF imagery and on-wall marking of detected structure
US10209357B2 (en) * 2016-04-28 2019-02-19 Fluke Corporation RF in-wall image registration using position indicating markers
US10564116B2 (en) 2016-04-28 2020-02-18 Fluke Corporation Optical image capture with position registration and RF in-wall composite image
US10302793B2 (en) 2016-08-04 2019-05-28 Fluke Corporation Blending and display of RF in wall imagery with data from other sensors
US10444344B2 (en) 2016-12-19 2019-10-15 Fluke Corporation Optical sensor-based position sensing of a radio frequency imaging device
GB2575561B (en) 2017-03-31 2022-06-08 FLIR Belgium BVBA Visually correlated radar systems and methods
CN107526078A (en) * 2017-09-29 2017-12-29 蒙城县永腾微行掌智能科技有限责任公司 A kind of life-detection system based on unmanned plane
US10297152B1 (en) * 2017-10-27 2019-05-21 Waymo Llc Displaying sensor data and supplemental data as a mask for autonomous vehicles
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball
EP3553551B1 (en) * 2018-04-10 2022-06-01 Aptiv Technologies Limited Method for the recognition of an object
EP3553552B1 (en) 2018-04-11 2022-05-25 Aptiv Technologies Limited Method for the recognition of a moving pedestrian
EP3553559B1 (en) 2018-04-11 2022-06-01 Aptiv Technologies Limited Method for the recognition of objects
US11378678B2 (en) * 2018-08-07 2022-07-05 Rohde & Schwarz Gmbh & Co. Kg Method as well as system for determining the three-dimensional alignment of components of a radar system
TWI677698B (en) * 2018-11-16 2019-11-21 昇雷科技股份有限公司 Biometric detection method and biometric detection radar
US11099270B2 (en) * 2018-12-06 2021-08-24 Lumineye, Inc. Thermal display with radar overlay
US11076303B2 (en) * 2018-12-18 2021-07-27 Sunsight Holdings, Llc Antenna alignment tool generating earth browser file and related methods
WO2020154962A1 (en) * 2019-01-30 2020-08-06 深圳市大疆创新科技有限公司 Target credibility determination method, target recognition method and system, vehicle, and storage medium
US11073610B2 (en) 2019-01-31 2021-07-27 International Business Machines Corporation Portable imager
US10451712B1 (en) * 2019-03-11 2019-10-22 Plato Systems, Inc. Radar data collection and labeling for machine learning
JP7296261B2 (en) * 2019-06-21 2023-06-22 パナソニックホールディングス株式会社 Monitoring device and monitoring method
CN111025256A (en) * 2019-12-26 2020-04-17 湖南华诺星空电子技术有限公司 Method and system for detecting weak vital sign signals of airborne radar
CN113364969B (en) * 2020-03-06 2023-05-12 华为技术有限公司 Imaging method of non-line-of-sight object and electronic equipment
CN113364970B (en) * 2020-03-06 2023-05-19 华为技术有限公司 Imaging method of non-line-of-sight object and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994016340A1 (en) * 1992-12-31 1994-07-21 Rockwell International Corporation Computerised radar process for measuring distances and relative speeds between a vehicle and obstacles in front of it
KR100331785B1 (en) * 1999-12-21 2002-04-17 이돈신 Automatic tracking antenna for satellite communication used in moving thing such as ship
KR20080056543A (en) * 2006-12-18 2008-06-23 황원 Satellite tracking antenna system based on the mobile of objects and method thereof
US20090135045A1 (en) * 2007-11-28 2009-05-28 Camero-Tech Ltd. Through-the-obstacle radar system and method of operation

Family Cites Families (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3010102A (en) * 1947-07-05 1961-11-21 Bell Telephone Labor Inc Combination radar and thermalenergy detection system
US3108270A (en) * 1954-12-23 1963-10-22 North American Aviation Inc Interlocked radar and infrared detector system
US3755812A (en) * 1967-12-29 1973-08-28 Texas Instruments Inc Moving and hard target indicator
US3644043A (en) * 1969-08-11 1972-02-22 Hughes Aircraft Co Integrated infrared-tracker-receiver laser-rangefinder target search and track system
US5268680A (en) * 1970-09-08 1993-12-07 Raytheon Company Combined infrared-radar detection system
US3592122A (en) * 1971-06-26 1971-07-13 Texaco Inc Gas dispersal unit
US3787857A (en) * 1972-05-18 1974-01-22 Sperry Rand Corp Dual mode radiometric system
US3981010A (en) * 1972-07-03 1976-09-14 Rmc Research Corporation Object locating system
US4062018A (en) * 1973-12-21 1977-12-06 Kokusai Denshin Denwa Kabushiki Kaisha Scanning antenna with moveable beam waveguide feed and defocusing adjustment
US3949404A (en) * 1974-12-19 1976-04-06 Nasa Highly efficient antenna system using a corrugated horn and scanning hyperbolic reflector
US4050068A (en) * 1976-03-15 1977-09-20 The United States Of America As Represented By The Secretary Of The Air Force Augmented tracking system
US4642641A (en) * 1976-07-29 1987-02-10 Westinghouse Electric Corp. Variable PRF target tracking radar
GB1603657A (en) * 1977-09-13 1981-11-25 Marconi Co Ltd Systems for the transmission and/or reception of electromagnetic waves
US4160251A (en) * 1977-12-19 1979-07-03 Sperry Rand Corporation Hybrid dual mode radiometric system
US5182564A (en) * 1984-07-26 1993-01-26 The Boeing Company Guidance apparatus with dual mode sensor
US4652885A (en) * 1985-03-04 1987-03-24 The United States Of America As Represented By The Secretary Of The Army Dual mode antenna for millimeter wave and infrared radiation
EP0205794B1 (en) * 1985-05-23 1990-08-29 Contraves Ag Search system for space/air space surveillance
JPS63255683A (en) * 1987-04-13 1988-10-21 Hitachi Ltd Apparatus for imaging extraneous substance
NL8802289A (en) * 1988-09-16 1990-04-17 Hollandse Signaalapparaten Bv TOUR SEARCH SENSOR.
US5097268A (en) * 1991-01-08 1992-03-17 Raytheon Company Radar apparatus and method for enhancing the display of moving targets
US5280287A (en) * 1992-06-30 1994-01-18 The United States Of America As Represented By The Secretary Of The Navy Coded identification and positioning system
US5456157A (en) * 1992-12-02 1995-10-10 Computing Devices Canada Ltd. Weapon aiming system
US5300933A (en) * 1993-02-24 1994-04-05 Daniel H. Wagner Associates, Inc. Stick figure radar tracking process
US5365237A (en) * 1993-05-13 1994-11-15 Thermo Trex Corporation Microwave camera
DE4413916A1 (en) * 1994-04-21 1995-11-02 Bodenseewerk Geraetetech Passive friend / foe discrimination facility
US5446461A (en) * 1994-04-28 1995-08-29 Hughes Missile Systems Company Concrete penetrating imaging radar
US5835054A (en) * 1996-03-01 1998-11-10 The Regents Of The University Of California Ultra wideband ground penetrating radar imaging of heterogeneous solids
US5781146A (en) * 1996-03-11 1998-07-14 Imaging Accessories, Inc. Automatic horizontal and vertical scanning radar with terrain display
US6359582B1 (en) * 1996-09-18 2002-03-19 The Macaleese Companies, Inc. Concealed weapons detection system
EP0899580A3 (en) * 1997-08-29 2000-02-02 Kabushiki Kaisha Toshiba Target locating system and approach guidance system
US5952957A (en) * 1998-05-01 1999-09-14 The United States Of America As Represented By The Secretary Of The Navy Wavelet transform of super-resolutions based on radar and infrared sensor fusion
DE19838246C2 (en) * 1998-08-22 2001-01-04 Daimler Chrysler Ag Bispectral window for a reflector and reflector antenna with this bispectral window
GB9819064D0 (en) * 1998-09-02 1998-10-28 Secr Defence Scanning apparatus
US7365672B2 (en) * 2001-03-16 2008-04-29 Battelle Memorial Institute Detection of a concealed object
US6876322B2 (en) * 2003-06-26 2005-04-05 Battelle Memorial Institute Concealed object detection
US6466155B2 (en) * 2001-03-30 2002-10-15 Ensco, Inc. Method and apparatus for detecting a moving object through a barrier
GB2382250B (en) * 2001-08-03 2006-01-04 Furuno Electric Co Vehicle information display apparatus
US7042409B2 (en) * 2001-09-27 2006-05-09 The Boeing Company Method and apparatus for mounting a rotating reflector antenna to minimize swept arc
CA2359599A1 (en) * 2001-10-22 2003-04-22 Kyle J. Doerksen Positioning system for ground penetrating radar instruments
US6985837B2 (en) * 2001-11-01 2006-01-10 Moon Dennis A System presenting meteorological information using a browser interface
US20070030195A1 (en) * 2002-09-12 2007-02-08 L-3 Communications Cyterra Corporation Concealed object detection
US20040051659A1 (en) * 2002-09-18 2004-03-18 Garrison Darwin A. Vehicular situational awareness system
US7415244B2 (en) * 2003-08-12 2008-08-19 Trey Enterprises Corp. Multi-channel millimeter wave imaging system
WO2005086620A2 (en) * 2003-10-10 2005-09-22 L-3 Communications Security And Detection Systems Mmw contraband screening system
US7148836B2 (en) * 2004-03-05 2006-12-12 The Regents Of The University Of California Obstacle penetrating dynamic radar imaging system
US7180441B2 (en) * 2004-04-14 2007-02-20 Safeview, Inc. Multi-sensor surveillance portal
US20050231416A1 (en) * 2004-04-14 2005-10-20 Rowe Richard L Relational millimeter-wave interrogating
US6972714B1 (en) * 2004-06-08 2005-12-06 Agilent Technologies, Inc. Optically-augmented microwave imaging system and method
US7253766B2 (en) * 2004-09-24 2007-08-07 Battelle Memorial Institute Three-dimensional surface/contour processing based on electromagnetic radiation interrogation
US7145506B2 (en) * 2005-01-21 2006-12-05 Safeview, Inc. Depth-based surveillance image reconstruction
US7161525B1 (en) * 2005-02-22 2007-01-09 Rockwell Collins, Inc. Turbulence display presentation
US8362942B2 (en) * 2005-04-14 2013-01-29 L-3 Communications Cyterra Corporation Moving-entity detection
US7940206B2 (en) * 2005-04-20 2011-05-10 Accipiter Radar Technologies Inc. Low-cost, high-performance radar networks
US7339516B2 (en) * 2005-08-11 2008-03-04 Realtronics Corporation Method to provide graphical representation of Sense Through The Wall (STTW) targets
EP1785743B1 (en) * 2005-11-09 2011-10-05 Saab Ab Multi-Sensor System
US7508342B2 (en) * 2005-11-18 2009-03-24 The Boeing Company Satellite antenna positioning system
US20070139248A1 (en) * 2005-12-16 2007-06-21 Izhak Baharav System and method for standoff microwave imaging
US7492303B1 (en) * 2006-05-09 2009-02-17 Personnel Protection Technologies Llc Methods and apparatus for detecting threats using radar
US8884763B2 * 2006-10-02 2014-11-11 iRobot Corporation Threat detection sensor suite
US7504993B2 * 2006-10-12 2009-03-17 Agilent Technologies, Inc. Coaxial bi-modal imaging system for combined microwave and optical imaging
US7474254B2 (en) * 2007-01-16 2009-01-06 Innovonix, Llc. Radar system with agile beam steering deflector
US7804442B2 (en) * 2007-01-24 2010-09-28 Reveal Imaging, Llc Millimeter wave (MMW) screening portal systems, devices and methods
GB0701869D0 (en) * 2007-01-31 2007-03-14 Cambridge Consultants Adaptive radar
US7781717B2 (en) * 2007-04-27 2010-08-24 Brijot Imaging Systems, Inc. System and method for manipulating real-time video playback time-synchronized with millimeter wave imagery
US8116520B2 (en) * 2007-06-20 2012-02-14 Microsemi Corporation System and method for overlaying computer generated highlights in a display of millimeter wave imagery
US7873182B2 (en) * 2007-08-08 2011-01-18 Brijot Imaging Systems, Inc. Multiple camera imaging method and system for detecting concealed objects
US20090292468A1 (en) * 2008-03-25 2009-11-26 Shunguang Wu Collision avoidance method and system using stereo vision and radar sensor fusion
US7701379B2 (en) * 2008-08-26 2010-04-20 Tialinx, Inc. Motion compensation for radar imaging
US20100194628A1 (en) * 2009-02-05 2010-08-05 Honeywell International Inc. Systems and methods for displaying radar-measured turbulence intensity on a vertical display
US20110084868A1 (en) * 2009-10-08 2011-04-14 Brijot Imaging Systems, Inc. Variable range millimeter wave method and system
GB2488699B (en) * 2009-11-03 2014-09-17 Vawd Applied Science & Technology Corp Standoff range sense through obstruction radar system
DE102010010912A1 (en) * 2010-03-10 2010-12-02 Daimler Ag Driver assistance device for vehicle, has sensor unit for detecting object in surrounding of vehicle and display unit for optical representation of detected object by sensor unit to schematic top view of vehicle
US20130076556A1 (en) * 2011-09-26 2013-03-28 United States Government, As Represented By The Secretary Of The Navy Active differential reflectometry

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994016340A1 (en) * 1992-12-31 1994-07-21 Rockwell International Corporation Computerised radar process for measuring distances and relative speeds between a vehicle and obstacles in front of it
KR100331785B1 (en) * 1999-12-21 2002-04-17 이돈신 Automatic tracking antenna for satellite communication used in moving thing such as ship
KR20080056543A (en) * 2006-12-18 2008-06-23 황원 Satellite tracking antenna system based on the mobile of objects and method thereof
US20090135045A1 (en) * 2007-11-28 2009-05-28 Camero-Tech Ltd. Through-the-obstacle radar system and method of operation

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110618465A (en) * 2018-06-04 2019-12-27 富士通株式会社 Article detection method and apparatus
CN110618465B (en) * 2018-06-04 2021-07-06 富士通株式会社 Article detection method and apparatus

Also Published As

Publication number Publication date
GB2488699B (en) 2014-09-17
IL219519A (en) 2015-08-31
WO2011056730A3 (en) 2011-08-04
GB2488699A (en) 2012-09-05
TR201205754T1 (en) 2012-09-21
US8791852B2 (en) 2014-07-29
GB201209474D0 (en) 2012-07-11
IL219519A0 (en) 2012-06-28
US20110102234A1 (en) 2011-05-05
US20140022118A1 (en) 2014-01-23

Similar Documents

Publication Publication Date Title
US8791852B2 (en) Standoff range sense through obstruction radar system
US10877130B2 (en) Drone detection radar
US10816658B2 (en) Radar enabled weapon detection system
US7148836B2 (en) Obstacle penetrating dynamic radar imaging system
Borek An overview of through the wall surveillance for homeland security
US6633254B1 (en) Self-modulating remote moving target detector
EP3156815B1 (en) Radar device and transmission-signal control method
JPH1114749A (en) Radar device
US8044839B2 (en) Combined radar and communications link
US7633426B2 (en) Combined radar and communications link
CN103529438A (en) System and method for monitoring moving target on ground by marine radar
KR20150100200A (en) Apparatus for detecting multi-target of unmanned security monitoring system
Giannì et al. Obstacle detection system involving fusion of multiple sensor technologies
EP4179359A2 (en) Through-wall radar sensors networked together to create 2d and 3d combined views of an area
WO2021087706A1 (en) Radar system, movable platform and radar system control method
RU2522910C2 (en) Automatic navigation radar with longer non-supervised self-contained operating period
Ghoghre et al. Radar system using arduino
RU2480787C1 (en) Method and system for remote detection of objects
EP3690478A1 (en) Radar device
KR101690781B1 (en) Method for Configuring Region of Interest of Radar Monitoring System and Apparatus Therefor
RU71781U1 (en) PASSIVE RADAR COMPLEX
RU2615988C1 (en) Method and system of barrier air defence radar detection of stealth aircraft based on gsm cellular networks
Borek et al. Through-the-wall surveillance for homeland security and law enforcement
RU2718954C1 (en) Matrix radiolocation station for area protection
JP2020030159A (en) Radar system and synchronization method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10828943

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 219519

Country of ref document: IL

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 1209474

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20101029

WWE Wipo information: entry into national phase

Ref document number: 1209474.4

Country of ref document: GB

122 Ep: pct application non-entry in european phase

Ref document number: 10828943

Country of ref document: EP

Kind code of ref document: A2