US20030140775A1 - Method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set - Google Patents
- Publication number: US20030140775A1 (application Ser. No. 10/060,565)
- Authority: US (United States)
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/06—Aiming or laying means with rangefinder
- F41G3/14—Indirect aiming means
- F41G3/16—Sighting devices adapted for indirect laying of fire
- F41G3/165—Sighting devices adapted for indirect laying of fire using a TV-monitor
- F41G3/22—Aiming or laying means for vehicle-borne armament, e.g. on aircraft
- F41G3/225—Helmet sighting systems
Definitions
- The present invention is directed to sighting and targeting systems and, more particularly, to a method and apparatus for targeting a controlled system from a common three-dimensional data set.
- Some sighting and targeting techniques employ a single set of two-dimensional data.
- A foot soldier, for example, may sight and target a firearm using the same data, i.e., the two-dimensional data available from, e.g., an infrared scope.
- Although this process enjoys some of the benefits of using a common set of data for both sighting and targeting, it suffers inherently from the lack of three-dimensional data. For instance, the soldier must still manually adjust for range and windage because the data lacks range information. The process also exposes the soldier to enemy fire, which is generally considered undesirable.
- The present invention is directed to resolving, or at least reducing, one or all of the problems mentioned above.
- The invention, in its various aspects, is a method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set. It includes a method comprising sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and targeting a controlled system to the position from the three-dimensional data set. It also includes an apparatus comprising a program storage medium, a controller, and a controller interface.
- The program storage medium is capable of storing a three-dimensional data set representing a field of view.
- The controller is capable of generating a presentation of the three-dimensional data set. A position represented by at least a subset of the three-dimensional data can be sighted and the position can be targeted from the subset through the controller interface.
- FIG. 1 conceptually illustrates a controlled system with which the present invention can be implemented to sight and target a position within a field of view;
- FIG. 2 is a conceptual block diagram of selected portions of the controlled system first shown in FIG. 1;
- FIG. 3 is a block diagram illustrating how selected portions of a controlled system, such as that shown in FIG. 2, may be implemented;
- FIG. 4 illustrates one particular embodiment of the present invention employed in a military context;
- FIG. 5 is a block diagram depicting a computing apparatus such as may be used in implementing the embodiment in FIG. 4;
- FIG. 6 illustrates a second embodiment alternative to that in FIG. 4;
- FIG. 7 illustrates one particular implementation of the scenario set forth in FIG. 4;
- FIG. 8 is a block diagram of one particular implementation of the controlled system first illustrated in FIG. 4.
- FIG. 9 depicts the handling of three-dimensional data acquired in the scenario in FIG. 7.
- FIG. 1 conceptually illustrates a controlled system 100 with which the present invention can be implemented to sight and target a position 110 within a field of view 120 .
- the field of view 120 comprises the volume defined by the propagation boundaries 122 and the plane 124 in accordance with conventional practice.
- the plane 124 represents the maximum range of the data acquisition mechanism (not shown in FIG. 1) used to gather three-dimensional data concerning the content of the field of view 120 .
- FIG. 2 is a block diagram of one implementation 200 of the controlled system 100 in FIG. 1.
- the controlled system 200 implements a fire control system in a military context using the present invention.
- the controlled system 200 includes a direct diode LADAR system 210 , a weapon platform 220 , a digital processor 230 , and a helmet mounted display 240 .
- the direct diode LADAR system 210 transmits laser light pulses that reflect from the target 250 back to the direct diode LADAR system 210 .
- the digital processor 230 extracts three-dimensional data from the reflected light received by the direct diode LADAR system 210 , and processes it to present a three-dimensional image (not shown) on the helmet mounted display 240 .
- the digital processor 230 also receives position information from the weapon platform 220 .
- a user then sights the controlled system at a position, e.g., the position 110 in FIG. 1, portrayed in the three-dimensional image.
- the sighted position 110 is the position of the target 250 .
- the digital processor 230 receives the sighting information from the helmet-mounted display 240 , and targets the position by issuing pointing commands to the weapon platform 220 .
- the targeting commands are formulated from the sighting information and the three-dimensional data extracted from the reflected light.
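The patent does not spell out how the targeting commands are formulated from the sighting information. As a hedged illustration only, the Python sketch below (all function and variable names are assumptions) shows the core idea: because the sighted pixel indexes directly into the common three-dimensional data set, the azimuth, elevation, and range needed to point the weapon platform can be read from the same data used for sighting.

```python
def pointing_command(range_image, az_grid, el_grid, row, col):
    """Recover an (azimuth, elevation, range) pointing command for the
    sighted pixel (row, col) from the common three-dimensional data set.

    range_image : 2-D list of ranges (meters), indexed [row][col]
    az_grid, el_grid : 2-D lists of the azimuth/elevation angles (radians)
                       recorded at each pixel during the LADAR scan
    """
    rng = range_image[row][col]
    if rng <= 0.0:  # a zero range marks an intensity dropout (no valid return)
        raise ValueError("no valid range at sighted pixel")
    return az_grid[row][col], el_grid[row][col], rng
```

In use, the digital processor would pass the returned triple (after trajectory and windage corrections, which this sketch omits) to the weapon platform as its pointing command.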
- the sighting and targeting in the controlled system 200 are performed from a common set of three-dimensional data.
- Because the sighting and targeting are performed from the same three-dimensional data, they inherently account for the range to the target 250 .
- Because the target 250 is sighted from an image and the targeting is performed by the digital processor 230 , the soldier is not necessarily exposed to enemy fire.
- the ability to sight from the three-dimensional image also means that the soldier can be located remotely from the weapon platform 220 in some implementations. Such an arrangement would provide an additional degree of safety for the soldier.
- Another advantage of employing a common set of three-dimensional data is that an impact point projected from the issued targeting commands can be displayed to the user for, e.g., confirmation.
- the invention admits wide variation in implementation. Many of these variations may be realized in the narrow context of a military application. For instance, even in the context of a weapon system such as the controlled system 200 in FIG. 2, many different types of LADAR systems, displays, processors, and weapon platforms might be employed in various alternative embodiments. Many of these variations might also arise from the nature of the application, as the invention has many civilian applications as well. For instance, many civilian applications such as hazardous waste disposal or remote surveillance would better benefit from a remote display instead of a helmet-mounted display. Some of these kinds of variations are explored below.
- FIG. 3 is a block diagram illustrating how selected portions of a controlled system 300 , such as the controlled system 200 in FIG. 2, may be implemented.
- The controlled system 300 comprises a data acquisition subsystem 305 ; an apparatus which, in the context of the controlled system 300 , may be referred to as a sighting and targeting subsystem 310 ; and a control subsystem 320 .
- the data acquisition subsystem 305 acquires a three-dimensional data set representing the field of view 120 (shown in FIG. 1) and its contents, including the position 110 (also shown in FIG. 1).
- The data acquisition subsystem 305 may include, e.g., the direct diode LADAR 210 of FIG. 2.
- the data in the three-dimensional data set typically is measured in a spherical coordinate system (e.g., range, elevation, and azimuth) but other coordinate systems (e.g., Cartesian) may be employed.
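To make the relationship between the two coordinate systems concrete, here is a minimal Python sketch of the conversion from spherical LADAR coordinates (range, azimuth, elevation) to Cartesian coordinates. The axis convention (x along the boresight at zero azimuth and elevation, y to the side, z up) is an illustrative assumption, not something the patent specifies.

```python
import math

def spherical_to_cartesian(rng, azimuth, elevation):
    """Convert one LADAR return from spherical (range, azimuth, elevation),
    angles in radians, to Cartesian (x, y, z).

    Assumed convention: x points along the boresight at zero azimuth and
    elevation, y to the side, z up.
    """
    x = rng * math.cos(elevation) * math.cos(azimuth)
    y = rng * math.cos(elevation) * math.sin(azimuth)
    z = rng * math.sin(elevation)
    return x, y, z
```

A return straight down the boresight (azimuth and elevation both zero) maps to a point at distance `rng` along the x axis, as expected.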
- a user then interacts with the sighting and targeting subsystem 310 to sight and target the position 110 as previously discussed relative to FIG. 2.
- The control subsystem 320 then implements instructions from the sighting and targeting subsystem 310 to implement the targeting.
- The three-dimensional data set acquired by the data acquisition subsystem 305 is stored in a program storage medium 312 .
- the program storage medium may be of any suitable type known to the art. For instance, it may be implemented in a magnetic (e.g., a floppy disk or a hard drive), or optical (e.g., a compact disk read only memory, or “CD ROM”) medium and may be read only or random access.
- the program storage medium 312 will generally be implemented in a magnetic, random access medium.
- the three-dimensional data may be stored by encoding it in any suitable data structure on the program storage medium 312 .
- a controller 314 then operates on the three-dimensional data set to process it for a controller interface 316 .
- the controller 314 may be any suitable data processing device, e.g., a digital signal processor (“DSP”) or a microprocessor, such as the digital processor 230 in FIG. 2.
- the controller interface 316 will typically present the three-dimensional data set to the user by displaying it as a three-dimensional image.
- the controller interface 316 will therefore typically process the three-dimensional data into a video data set for display as a three dimensional image.
- the controller interface 316 might also perform a variety of preprocessing activities associated with this type of processing. For instance, the controller 314 might fuse the three-dimensional data set with two-dimensional data regarding the field of view 120 (shown in FIG. 1) acquired in addition to the three-dimensional data set.
- Many techniques for this type of processing and pre-processing are known to the art and any such technique suitable to the particular implementation may be employed.
- the controller interface 316 typically presents the three-dimensional data set as a three-dimensional image.
- the controller interface 316 will therefore typically include a video display (not shown in FIG. 3) of some kind that may be rack-mounted or part of a heads-up display (“HUD”), such as the helmet-mounted display 240 in FIG. 2.
- the controller interface 316 may also include one or more peripheral input/output (“I/O”) devices (also not shown in FIG. 3), such as a keyboard, a mouse, and a joystick.
- the video display could include, for instance, a touch screen such that the user can input directly without the aid of peripheral I/O devices.
- the invention includes an apparatus that may be employed as a subsystem (e.g., the sighting and targeting subsystem 310 ) in a controlled system (e.g., the controlled system 300 ).
- This apparatus comprises a program storage medium (e.g., the program storage medium 312 ), a controller (e.g., the controller 314 ), and a controller interface (e.g., the controller interface 316 ).
- the program storage medium when the apparatus is employed in accordance with the present invention, stores a three-dimensional data set representing a field of view (e.g., the field of view 120 ).
- the controller generates a presentation of the three-dimensional data set.
- a user then sights and targets a position in that field of view from a presentation of the three-dimensional data by indicating a subset of that presentation.
- This aspect further includes a method in which the user sights a position correlated to a subset of the three-dimensional data set representing the field of view and targets a controlled system to the position from the three-dimensional data set.
- In the scenario 400 illustrated in FIG. 4, a vehicle 410 includes a weapon platform and is equipped and operated in accordance with the present invention.
- the occupant (not shown) of the vehicle 410 is interested in destroying, or at least incapacitating, the lead tank 420 in an advancing column within the field of view 430 .
- the occupant employs the present invention to sight and target the weapons system (also not shown) on the position 440 of the lead tank 420 . Once the weapons system is targeted, it can be fired.
- The vehicle 410 is equipped with a laser-radar (“LADAR”)-based data acquisition system (not shown) that paints the field of view 430 with a number of laser pulses (also not shown).
- the laser pulses propagate through the field of view 430 until they encounter an object (e.g., the tank 420 , a tree, or the ground) and are reflected back to the vehicle 410 . From the reflected pulses, the data acquisition system extracts a three-dimensional data set. Any suitable LADAR system known to the art may be employed.
- the vehicle 410 is also equipped with a rack-mounted computing apparatus 500 , conceptually illustrated in FIG. 5.
- the computing apparatus includes a processor 505 communicating with some storage 510 over a bus system 515 .
- the storage 510 may include a hard disk and/or RAM and/or removable storage such as the floppy magnetic disk 515 and the optical disk 520 .
- the storage 510 is encoded with a data structure 525 storing the three-dimensional data set acquired as discussed above.
- the storage 510 is one implementation of the program storage medium 312 (shown in FIG. 3).
- the storage 510 is also encoded with an operating system 530 and some user interface software 535 that, in conjunction with the display 540 , constitute a user interface 545 .
- the user interface 545 is one implementation of the controller interface 316 in FIG. 3.
- the display 540 may be a touch screen allowing the user to input directly into the computing apparatus.
- the user interface 545 may include peripheral I/O devices such as the keyboard 550 , the mouse 555 , or the joystick 560 , for use with other types of displays, e.g., a HUD.
- the processor 505 runs under the control of the operating system 530 , which may be practically any operating system known to the art.
- the processor 505 under the control of the operating system 530 , invokes the user interface software 535 on startup so that the operator can control the computing apparatus 500 .
- the storage 510 is also encoded with an application 565 invoked by the processor 505 under the control of the operating system 530 or by the user through the user interface 545 .
- the application 565 when executed by the processor 505 , performs any processing or pre-processing on the three-dimensional data stored in the data structure 525 .
- the application 565 also displays the three-dimensional image 450 of the field of view 430 (shown in FIG. 4), or a portion thereof, on the display 540 .
- the three-dimensional image 450 may be presented to a passenger of the controlled system 410 or to a remote user.
- the user then sights the weapons subsystem (not shown) by indicating a subset 460 of the three-dimensional image 450 .
- the manner in which this designation occurs will be implementation specific depending upon the manner in which I/O is to occur, e.g., by touching a touch screen or designating with a mouse or joystick.
- the invention admits wide variation in many aspects.
- the invention is not limited to ground-based or stationary controlled systems and the user may be local or remote relative to the controlled system.
- In the scenario of FIG. 6, the controlled system is implemented as a flying submunition 610 and the target is a moving ship 620 .
- the flying submunition 610 transmits laser pulses 630 that are reflected from the ship 620 .
- The three-dimensional data is extracted from the returned signals 640 , and either the three-dimensional data or a three-dimensional image is transmitted to a remote user aboard, for instance, an aircraft 650 . This information is transmitted by electromagnetic signals 660 .
- the three-dimensional image is then displayed to the remote user, who then makes the designation to sight the flying submunition 610 on the ship 620 .
- the designation is transmitted by the electromagnetic signals 670 to the flying submunition 610 .
- Upon receiving the designation, the flying submunition 610 targets the ship 620 for destruction.
- alternative scenarios might be ground-to-air or air-to-air scenarios.
- FIG. 7 presents one implementation 700 of the scenario 400 shown in FIG. 4.
- the implementation 700 is modified from a data acquisition and target identification process first shown in:
- This patent discloses a method by which targets are identified from three-dimensional images generated from three-dimensional data.
- the present invention employs a three-dimensional data set such as the one obtained by this prior art method to both sight and target the controlled system, i.e., a weapon platform in this scenario.
- A LADAR system scans a target scene to provide on-site a three-dimensional image (representation) of the target scene. This image is processed to detect and segment potential targets. The segmentations representing these potential targets are then further processed by feature extraction and classification processes to identify the target. The segmentations of targets of interest, completed prior to feature extraction and classification, are displayed locally, transmitted to a remote site for display, or both. Because only the segmented targets, rather than the entire scene, are transmitted, this process allows communications over data links of limited bandwidth. A position within a segmented target may then be sighted from the segmented three-dimensional image. A weapon system can then be targeted using the same three-dimensional data set.
- the system 710 includes a vehicle 714 for housing a weapon platform 715 , a transmitter and sensor platform 716 and a processing center 717 .
- the platform 716 includes a conventional LADAR system that generates and directs a laser beam to scan the target scene 712 , including the targets. Reflected laser light is detected by the platform 716 , and the processing center 717 processes the reflected light in a conventional manner into a scan data representative of the target scene 712 .
- a Global Positioning System (“GPS”) transmitter 718 transmits a signal for providing accurate position data for the vehicle 714 .
- the system 710 processes three-dimensional LADAR images of the target scene 712 in the processing center 717 , manipulates the resulting data into packets of information containing a segment of the scene, and transmits these packets of information by a communications link 720 to a remote site 722 for display on a display 724 .
- The remote site 722 may be, for example, a combat platform or a command and control node, and has access to data characterizing a local scene.
- FIG. 8 depicts one embodiment of the sensor platform 716 and processing platform 717 housed on the vehicle 714 in FIG. 7.
- the sensor platform 716 supports a GPS receiver 830 disposed to receive a signal 833 from the GPS transmitter 718 in FIG. 7.
- a thermal imager 832 provides passive field of view search capability of the target scene 712 in FIG. 7.
- a LADAR sensor 834 generates a scan signal representative of an image of the target scene 712 in FIG. 7 by scanning a laser beam across the scene 712 and detecting reflections of the laser beam as it scans the scene 712 .
- a suitable system is disclosed in:
- the processing center 717 includes a digital processor 836 for processing the scan signal into three-dimensional LADAR images.
- a data manipulator 838 is provided for manipulating selected three-dimensional LADAR image data into packets of information which may be displayed by a local display terminal 840 at the vehicle 714 (shown in FIG. 7) for on-site display of the target 721 .
- a data transmitter 842 is also provided for transmitting the packets of segmented information over the limited bandwidth communications link 720 in FIG. 7.
- the vehicle 714 in FIG. 7 maneuvers into position to survey the target scene 712 in FIG. 7.
- the position of the vehicle 714 is read from the GPS receiver 830 housed on the sensor platform 716 in FIG. 7.
- the target scene 712 in FIG. 7 is scanned with the LADAR sensor 834 which is aligned with a compass providing a north reference.
- the LADAR sensor 834 collects data from the scanned target scene 712 in FIG. 7 and generates scan data representative of a three-dimensional image.
- the digital processor 836 of processing center 717 further processes the scan data. This processing generally involves initially representing detected signals as data elements in a spherical coordinate system, wherein each data element includes a range value and an intensity value that correspond to a point on the target scene 712 in FIG. 7.
- the processor 836 then converts these data elements into a row-column format where the row-column represents the two angles in the spherical coordinate system and the data element is the range.
- the digital processor 836 initially processes the scan data into a three-dimensional LADAR image according to a spherical coordinate system of some type, which has an origin that coincides with the location of the LADAR sensor's detecting optics. This may be performed in accordance with known techniques.
- the spherical coordinate system is convenient in generating the three-dimensional images since the angular position of a point in the target scene 712 in FIG. 7 may be measured with respect to axes that coincide with the axes of rotation of the LADAR sensor's detecting optics during scanning of the target scene 712 in FIG. 7. Moreover, the spherical coordinate system is conducive to storing the range of a point in the target scene 712 in FIG. 7, since this range corresponds to a radius from the LADAR sensor's detecting optics to the point. Each data element also includes an intensity value, representative of the intensity of the reflected light. Additionally, each data element includes an azimuth angle and an elevation angle. As indicated, this three-dimensional LADAR image is stored by the processor 836 in a row-column format for later use.
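The row-column storage described above can be sketched in Python. This is a hedged illustration only: the grid origin, angular step sizes, and the choice of zero as the "no return" marker are assumptions, not details from the patent.

```python
def to_row_column(returns, az0, el0, d_az, d_el, rows, cols):
    """Bin LADAR returns (range, azimuth, elevation, intensity) into a
    row-column range image: the row indexes elevation, the column indexes
    azimuth, and the stored data element is the range.

    az0/el0 are the grid origin angles and d_az/d_el the angular step per
    column/row (radians). Zero marks pixels with no valid return.
    """
    rng_img = [[0.0] * cols for _ in range(rows)]
    for rng, az, el, _intensity in returns:
        r = int(round((el - el0) / d_el))
        c = int(round((az - az0) / d_az))
        if 0 <= r < rows and 0 <= c < cols:
            rng_img[r][c] = rng
    return rng_img
```

A parallel grid of the same shape could hold the intensity values, which this sketch discards for brevity.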
- FIG. 9 illustrates the handling of the three-dimensional data set acquired as discussed immediately above.
- the LADAR three-dimensional data in row column format (at 950 ) is further processed by the digital processor 836 or, alternatively, by an off-site processor (not shown) such as a personal computer, a mini-computer, or other suitable computing device.
- This further processing generally involves preprocessing (at 952 ), detection (at 954 ), segmentation (at 956 ), feature extraction (at 958 ), and classification (at 960 ).
- the preprocessing (at 952 ) is directed to minimizing noise effects, such as identifying so-called intensity dropouts in the converted three-dimensional image, where the range value of the LADAR image is set to zero.
- Noise introduced into the converted three-dimensional LADAR image by low signal-to-noise ratio (“SNR”) conditions is processed so that performance of the overall system 710 is not degraded.
- The converted LADAR image signal is filtered so that absolute range measurement distortion is minimized, edge preservation is maximized, and texture steps (which result from actual structure in the objects being imaged) are preserved.
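The patent does not name the particular noise filter used. One common technique that meets these goals (smoothing range noise while preserving edges) is a median filter that skips intensity dropouts; the Python sketch below is an assumed illustration, not the patented method.

```python
def median_filter_ranges(img):
    """3x3 median filter over a row-column range image.

    Pixels with range 0.0 (intensity dropouts) are excluded from each
    neighborhood so that dropouts do not distort absolute ranges; the
    median, unlike a mean, preserves range edges and texture steps.
    """
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]  # border pixels pass through unfiltered
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            vals = sorted(img[rr][cc]
                          for rr in (r - 1, r, r + 1)
                          for cc in (c - 1, c, c + 1)
                          if img[rr][cc] > 0.0)  # skip dropouts
            if vals:
                out[r][c] = vals[len(vals) // 2]
    return out
```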
- Detection (at 954 ) identifies specific regions of interest in the three-dimensional LADAR image.
- The detection uses range cluster scores as a measure to locate flat, vertical surfaces in an image. More specifically, a range cluster score is computed at each pixel to determine whether the pixel lies on a flat, vertical surface. The flatness of a particular surface is determined by examining how many pixels are within a given range in a small region of interest. The given range is defined by a threshold value that can be adjusted to vary performance. For example, if a computed range cluster score exceeds a specified threshold value, the corresponding pixel is marked as a detection. If a corresponding group of pixels meets specified size criteria, the group of pixels is referred to as a region of interest. Regions of interest, for example those containing one or more targets, are determined and passed to a segmenter for further processing.
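The range-cluster test just described can be sketched as follows. The window size, range gate, and score threshold are the adjustable values the patent mentions; the specific numbers here are illustrative assumptions.

```python
def range_cluster_score(img, r, c, window=2, gate=0.5):
    """Fraction of pixels in a (2*window+1)^2 neighborhood whose range lies
    within `gate` meters of the center pixel's range. A high score means
    the center pixel lies on a locally flat surface facing the sensor.
    """
    center = img[r][c]
    if center <= 0.0:  # intensity dropout: cannot score
        return 0.0
    total = hits = 0
    for rr in range(r - window, r + window + 1):
        for cc in range(c - window, c + window + 1):
            if 0 <= rr < len(img) and 0 <= cc < len(img[0]):
                total += 1
                if abs(img[rr][cc] - center) <= gate:
                    hits += 1
    return hits / total

def detect(img, score_threshold=0.8):
    """Mark a detection at each pixel whose cluster score meets the threshold."""
    return [[range_cluster_score(img, r, c) >= score_threshold
             for c in range(len(img[0]))]
            for r in range(len(img))]
```

Raising the `gate` or lowering `score_threshold` makes detection more permissive, which is the performance trade the patent alludes to.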
- Segmentation (at 956 ) determines, for each detection of a target, which pixels in a region of interest belong to the detected target and which belong to the detected target's background. Segmentation (at 956 ) identifies possible targets, for example, those whose connected pixels exceed a height threshold above the ground plane. More specifically, the segmentation (at 956 ) separates target pixels from adjacent ground pixels and the pixels of nearby objects, such as bushes and trees.
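A simple way to realize the segmentation step described above is a flood fill over connected pixels exceeding a height threshold above the ground plane. The sketch below is an assumed implementation (the patent does not give one); the height threshold and 4-connectivity are illustrative choices.

```python
def segment(height, mask, min_height=0.5):
    """Grow segments of connected pixels at least `min_height` above the
    ground plane, seeded from detected pixels (mask), and return one pixel
    set per segment. Ground and low background pixels terminate the fill,
    separating the target from adjacent ground, bushes, and trees.
    """
    rows, cols = len(height), len(height[0])
    seen, segments = set(), []
    for r0 in range(rows):
        for c0 in range(cols):
            if not mask[r0][c0] or (r0, c0) in seen:
                continue
            stack, seg = [(r0, c0)], set()
            while stack:
                r, c = stack.pop()
                if (r, c) in seen or not (0 <= r < rows and 0 <= c < cols):
                    continue
                seen.add((r, c))
                if height[r][c] < min_height:
                    continue  # ground/background pixel: stop growing here
                seg.add((r, c))
                stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
            if seg:
                segments.append(seg)
    return segments
```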
- Feature extraction provides information about a segmentation (at 956 ) so that the target and its features in that segmentation can be classified.
- Features include, for example, orientation, length, width, height, radial features, turret features, and moments.
- the feature extraction (at 958 ) also typically compensates for errors resulting from segmentation (at 956 ) and other noise contamination.
- Feature extraction (at 958 ) generally determines a target's three-dimensional orientation and size. The feature extraction (at 958 ) also distinguishes between targets and false alarms and between different classes of targets.
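The size features named above can be sketched from a segment's Cartesian points. This is a deliberately coarse, assumed illustration: a fielded system would first rotate the points into the target's principal axes to get true length and width, a step this sketch omits.

```python
def extract_features(points):
    """Coarse size features for a segmented target from its Cartesian
    points (x, y, z): axis-aligned extents as length, width, and height.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    return {
        "length": max(xs) - min(xs),
        "width":  max(ys) - min(ys),
        "height": max(zs) - min(zs),
    }
```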
- Classification determines which segmentations contain particular targets, usually in a two-stage process. First, features such as length, width, height, height variance, height skew, height kurtosis, and radial measures are used to initially discard non-target segmentations. The segmentations that survive this step are then matched with true target data stored in a target database.
- the data in the target database for example, may include length, width, height, average height, hull height, and turret height to classify a target.
- the classification is performed using known methods for table look-ups and comparisons.
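The two-stage scheme can be sketched as below. The gating bounds, the matching tolerance, and the database contents are all illustrative assumptions; only the two-stage structure (coarse discard, then table lookup and comparison) comes from the text.

```python
def classify(features, target_db, tolerance=0.25):
    """Two-stage classification sketch.

    Stage 1 discards segmentations whose gross dimensions fall outside
    plausible target bounds (bounds assumed). Stage 2 matches survivors
    against a database of true target dimensions by lookup and comparison.
    """
    # Stage 1: coarse gate on overall size (meters; assumed bounds)
    if not (2.0 < features["length"] < 12.0 and
            1.0 < features["width"] < 5.0 and
            1.0 < features["height"] < 4.0):
        return None
    # Stage 2: first database entry within tolerance on every dimension
    for name, dims in target_db.items():
        if all(abs(features[k] - dims[k]) <= tolerance * dims[k]
               for k in ("length", "width", "height")):
            return name
    return None
```

A real database entry would also carry the average height, hull height, and turret height mentioned above; this sketch keeps only the three gross dimensions.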
- Data obtained from the segmentation (at 956 ), the feature extraction (at 958 ), and the classification (at 960 ) is assembled into a packet of information (not shown).
- the packet may be rapidly and accurately transmitted to the remote site 722 in FIG. 7 and displayed in one of a variety of user-selectable formats.
- Typical formats include a three-view commonly used by armed forces to identify targets during combat, a north reference plan view, or a rotated perspective. These display options available to the operator, either local or remote, are based on the three-dimensional nature of the LADAR image.
- the results of the feature extraction (at 958 ) provide target information including orientation, length, width and height.
- the target image can be displayed from any perspective, independent of the sensor perspective, and the operator can select one of the several display formats that utilize the adjustable perspective.
- the data processed by the method 900 in FIG. 9 is transmitted as described over the communication link 720 to the remote site 722 and displayed on the remote display 724 .
- the user can sight the weapon platform 715 on the target 721 (shown in FIG. 7) by indicating a point on the displayed image. Note that the sighting can also be performed in this manner on the local display 840 , if desired.
- the sighted position is transmitted back to the processing center 717 , whereupon the digital processor 836 issues pointing commands to the weapon platform 715 .
- the control subsystem 844 then implements the pointing commands to complete the targeting.
- Although the invention is described above relative to military fire control systems, the invention is not so limited.
- the above described invention makes it possible in a number of military and civilian applications to integrate sighting and targeting activities more accurately than with the state of the art.
- Robotic tools used in hazardous waste cleanup, for example, may be operated more safely and efficiently, since the sighting and targeting of the robotic tool (e.g., to a hazardous waste deposit needing cleanup) are performed from a common, three-dimensional data set.
- Other civilian applications, such as law enforcement, robotic fire-fighting, crime deterrence, and border patrol functions, may also benefit from the application of the current invention.
- the present invention is not limited to applications found in a military context.
Abstract
A method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set are disclosed. The method includes sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and targeting a controlled system to the position from the three-dimensional data set. The apparatus includes a program storage medium, a controller, and a controller interface. The program storage medium is capable of storing a three-dimensional data set representing a field of view. The controller is capable of generating a presentation of the three-dimensional data set. A position represented by at least a subset of the three-dimensional data can be sighted and the position can be targeted from the subset through the controller interface.
Description
- 1. Field of the Invention
- The present invention is directed to sighting and targeting systems and, more particularly, to a method and apparatus for targeting a controlled system from a common three-dimensional data set.
- 2. Description of the Related Art
- Many human endeavors involve “sighting” and “targeting.” Although many of these endeavors are civilian, these terms are most commonly associated with weapons systems in a military context. For instance, consider the automatic fire control system disclosed in U.S. Pat. No. 4,004,729, issued Jan. 25, 1977, to Lockheed Electronics Co., Inc. as assignee of the inventors Harris C. Rawicz, et al. (“Rawicz et al.”). A gunner uses an optical sight to sight a weapons system on a target. A digital processor then manipulates a variety of factors, e.g., the velocity of the target and the target's position, to determine a projected position for the target. The digital processor then targets the weapon to fire at the target at its projected position.
- One important distinction between sighting and targeting in these types of systems is the nature of the data involved. Sighting is typically performed on two-dimensional data, whereas targeting is performed on three-dimensional data. Consider, again, the weapons control system in Rawicz et al. The gunner sights the target by placing the cross hairs of the optical sight on the target. Thus, the gunner sights in azimuth (i.e., where on the horizon) and elevation (i.e., how far above the horizon). However, targeting also takes into account the range (i.e., how far away) of the target. Range is important because projectile trajectories and environmental conditions (e.g., windage) affect the targeting.
- Operating off two different sets of data can be disadvantageous for a number of reasons. Most of these reasons arise from the fact that two different data sets generally require two different acquisition systems. Returning to the weapons system of Rawicz et al., the digital processor determines the projected position using both the two-dimensional data (i.e., the current position indicated by the sighting) and the three-dimensional data (i.e., the velocity of the target). Thus, errors occurring in both the two-dimensional and the three-dimensional data acquisition infect the targeting. Two different acquisition systems also consume more physical space than does a single acquisition system, which can be a significant constraint in some demanding applications.
- Some sighting and targeting techniques employ a single set of two-dimensional data. A foot soldier may sight and target a firearm using the same data, i.e., the two-dimensional data available from, e.g., an infrared scope. However, while this process enjoys some of the benefits available from using a common set of data for both sighting and targeting, it suffers inherently from the lack of three-dimensional data. For instance, the soldier must still manually adjust for range and windage because the data lacks information regarding the range. The process also exposes the soldier to enemy fire, which is generally considered undesirable.
- The present invention is directed to resolving, or at least reducing, one or all of the problems mentioned above.
- The invention, in its various aspects, is a method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set. It includes a method, comprising sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and targeting a controlled system to the position from the three-dimensional data set. It also includes an apparatus comprising a program storage medium, a controller, and a controller interface. The program storage medium is capable of storing a three-dimensional data set representing a field of view. The controller is capable of generating a presentation of the three-dimensional data set. A position represented by at least a subset of the three-dimensional data can be sighted and the position can be targeted from the subset through the controller interface.
- The invention may be understood by reference to the following description taken in conjunction with the accompanying drawings, in which like reference numerals identify like elements, and in which:
- FIG. 1 conceptually illustrates a controlled system with which the present invention can be implemented to sight and target a position within a field of view;
- FIG. 2 is a conceptual block diagram of selected portions of the controlled system first shown in FIG. 1;
- FIG. 3 is a block diagram illustrating how selected portions of a controlled system, such as that shown in FIG. 2, may be implemented;
- FIG. 4 illustrates one particular embodiment of the present invention employed in a military context;
- FIG. 5 is a block diagram depicting a computing apparatus such as may be used in implementing the embodiment in FIG. 4;
- FIG. 6 illustrates a second embodiment alternative to that in FIG. 4;
- FIG. 7 illustrates one particular implementation of the scenario set forth in FIG. 4;
- FIG. 8 is a block diagram of one particular implementation of the controlled system first illustrated in FIG. 4; and
- FIG. 9 depicts the handling of three-dimensional data acquired in the scenario in FIG. 7.
- While the invention is susceptible to various modifications and alternative forms, the drawings illustrate specific embodiments herein described in detail by way of example. It should be understood, however, that the description herein of specific embodiments is not intended to limit the invention to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
- The detailed description below illustrates exemplary embodiments of the invention. In the interest of clarity, not all features of an actual implementation are described in this specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort, even if complex and time-consuming, would be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
- Turning now to the drawings, FIG. 1 conceptually illustrates a controlled
system 100 with which the present invention can be implemented to sight and target a position 110 within a field of view 120. The field of view 120 comprises the volume defined by the propagation boundaries 122 and the plane 124 in accordance with conventional practice. The plane 124 represents the maximum range of the data acquisition mechanism (not shown in FIG. 1) used to gather three-dimensional data concerning the content of the field of view 120. - FIG. 2 is a block diagram of one
implementation 200 of the controlled system 100 in FIG. 1. The controlled system 200 implements a fire control system in a military context using the present invention. The controlled system 200 includes a direct diode LADAR system 210, a weapon platform 220, a digital processor 230, and a helmet-mounted display 240. The direct diode LADAR system 210 transmits laser light pulses that reflect from the target 250 back to the direct diode LADAR system 210. The digital processor 230 extracts three-dimensional data from the reflected light received by the direct diode LADAR system 210, and processes it to present a three-dimensional image (not shown) on the helmet-mounted display 240. The digital processor 230 also receives position information from the weapon platform 220. A user then sights the controlled system at a position, e.g., the position 110 in FIG. 1, portrayed in the three-dimensional image. In the implementation of FIG. 2, the sighted position 110 is the position of the target 250. The digital processor 230 receives the sighting information from the helmet-mounted display 240, and targets the position by issuing pointing commands to the weapon platform 220. The targeting commands are formulated from the sighting information and the three-dimensional data extracted from the reflected light. - Note that the sighting and targeting in the controlled
system 200 are performed from a common set of three-dimensional data. Thus, the sighting and targeting a priori account for the range to the target 250. Furthermore, because the target 250 is sighted from an image and the targeting is performed by the digital processor 230, the soldier is not necessarily exposed to enemy fire. Indeed, the ability to sight from the three-dimensional image also means that the soldier can be located remotely from the weapon platform 220 in some implementations. Such an arrangement would provide an additional degree of safety for the soldier. Another advantage of employing a common set of three-dimensional data is that an impact point projected from the issued targeting commands can be displayed to the user for, e.g., confirmation. - Note that the invention admits wide variation in implementation. Many of these variations may be realized in the narrow context of a military application. For instance, even in the context of a weapon system such as the controlled
system 200 in FIG. 2, many different types of LADAR systems, displays, processors, and weapon platforms might be employed in various alternative embodiments. Many of these variations might also arise from the nature of the application, as the invention has many civilian applications as well. For instance, many civilian applications such as hazardous waste disposal or remote surveillance would benefit more from a remote display than from a helmet-mounted display. Some of these kinds of variations are explored below. - FIG. 3 is a block diagram illustrating how selected portions of a controlled
system 300, such as the controlled system 200 in FIG. 2, may be implemented. The controlled system 300 comprises a data acquisition subsystem 305; an apparatus, which, in the context of the controlled system 300, may be referred to as a sighting and targeting subsystem 310; and a control subsystem 320. Generally, the data acquisition subsystem 305 acquires a three-dimensional data set representing the field of view 120 (shown in FIG. 1) and its contents, including the position 110 (also shown in FIG. 1). The data acquisition subsystem 305 may include, e.g., the direct diode LADAR system 210 of FIG. 2. The data in the three-dimensional data set typically is measured in a spherical coordinate system (e.g., range, elevation, and azimuth), but other coordinate systems (e.g., Cartesian) may be employed. A user (not shown) then interacts with the sighting and targeting subsystem 310 to sight and target the position 110 as previously discussed relative to FIG. 2. The control subsystem 320 then implements instructions from the sighting and targeting subsystem 310 to implement the targeting. - More particularly, the three-dimensional data set acquired by the
data acquisition subsystem 305 is stored in a program storage medium 312. The program storage medium may be of any suitable type known to the art. For instance, it may be implemented in a magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read-only memory, or “CD-ROM”) medium and may be read-only or random access. However, the program storage medium 312 will generally be implemented in a magnetic, random-access medium. The three-dimensional data may be stored by encoding it in any suitable data structure on the program storage medium 312. - A
controller 314 then operates on the three-dimensional data set to process it for a controller interface 316. The controller 314 may be any suitable data processing device, e.g., a digital signal processor (“DSP”) or a microprocessor, such as the digital processor 230 in FIG. 2. The controller interface 316 will typically present the three-dimensional data set to the user by displaying it as a three-dimensional image. The controller interface 316 will therefore typically process the three-dimensional data into a video data set for display as a three-dimensional image. The controller 314 might also perform a variety of preprocessing activities associated with this type of processing. For instance, the controller 314 might fuse the three-dimensional data set with two-dimensional data regarding the field of view 120 (shown in FIG. 1) acquired in addition to the three-dimensional data set. Many techniques for this type of processing and pre-processing are known to the art, and any such technique suitable to the particular implementation may be employed. - Many details of the
controller interface 316 will be implementation specific. As was mentioned, the controller interface 316 typically presents the three-dimensional data set as a three-dimensional image. The controller interface 316 will therefore typically include a video display (not shown in FIG. 3) of some kind that may be rack-mounted or part of a heads-up display (“HUD”), such as the helmet-mounted display 240 in FIG. 2. The controller interface 316 may also include one or more peripheral input/output (“I/O”) devices (also not shown in FIG. 3), such as a keyboard, a mouse, and a joystick. However, the video display could include, for instance, a touch screen so that the user can input directly without the aid of peripheral I/O devices.
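Purely as an illustrative aside (not part of the patented disclosure), the central idea of the apparatus just described — that one stored three-dimensional data set serves both sighting and targeting, so range is already available when pointing commands are formulated — can be sketched in a few lines. All names below (fov_points, sight, pointing_commands) are hypothetical:

```python
# Minimal sketch: a single stored three-dimensional data set is used for BOTH
# sighting and targeting, so the range component is known a priori and no
# second acquisition system is needed. All names are illustrative assumptions.

# Three-dimensional data set: (azimuth deg, elevation deg) -> range in meters.
fov_points = {
    (10.0, 2.0): 850.0,
    (12.5, 1.5): 910.0,
    (15.0, 3.0): 780.0,
}

def sight(azimuth, elevation):
    """Sighting: the user indicates a subset (here, one sample) of the data set."""
    return (azimuth, elevation, fov_points[(azimuth, elevation)])

def pointing_commands(position):
    """Targeting: formulated from the SAME data set, so range is available to
    drive trajectory or windage corrections without a separate range finder."""
    azimuth, elevation, rng = position
    return {"train_deg": azimuth, "elevate_deg": elevation, "range_m": rng}

commands = pointing_commands(sight(12.5, 1.5))
```

Because sighting and targeting read the same samples, an acquisition error affects both consistently instead of compounding across two independent sensors — the advantage the description attributes to the common data set.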
- For a more concrete example, consider the
scenario 400 presented in FIG. 4. The scenario 400 occurs in a military context, although the invention is not so limited. In the scenario 400, a vehicle 410 includes a weapon platform and is equipped and operated in accordance with the present invention. The occupant (not shown) of the vehicle 410 is interested in destroying, or at least incapacitating, the lead tank 420 in an advancing column within the field of view 430. The occupant employs the present invention to sight and target the weapons system (also not shown) on the position 440 of the lead tank 420. Once the weapons system is targeted, it can be fired. - The
vehicle 410 is equipped with a laser-radar (“LADAR”) based data acquisition system (not shown) that paints the field of view 430 with a number of laser pulses (also not shown). The laser pulses propagate through the field of view 430 until they encounter an object (e.g., the tank 420, a tree, or the ground) and are reflected back to the vehicle 410. From the reflected pulses, the data acquisition system extracts a three-dimensional data set. Any suitable LADAR system known to the art may be employed. The vehicle 410 is also equipped with a rack-mounted computing apparatus 500, conceptually illustrated in FIG. 5. The computing apparatus includes a processor 505 communicating with some storage 510 over a bus system 515. The storage 510 may include a hard disk and/or RAM and/or removable storage such as the floppy magnetic disk 515 and the optical disk 520. The storage 510 is encoded with a data structure 525 storing the three-dimensional data set acquired as discussed above. Thus, the storage 510 is one implementation of the program storage medium 312 (shown in FIG. 3). - The
storage 510 is also encoded with an operating system 530 and some user interface software 535 that, in conjunction with the display 540, constitute a user interface 545. The user interface 545 is one implementation of the controller interface 316 in FIG. 3. As previously noted, the display 540 may be a touch screen allowing the user to input directly into the computing apparatus. However, the user interface 545 may include peripheral I/O devices such as the keyboard 550, the mouse 555, or the joystick 560, for use with other types of displays, e.g., a HUD. The processor 505 runs under the control of the operating system 530, which may be practically any operating system known to the art. The processor 505, under the control of the operating system 530, invokes the user interface software 535 on startup so that the operator can control the computing apparatus 500. - The
storage 510 is also encoded with an application 565 invoked by the processor 505 under the control of the operating system 530 or by the user through the user interface 545. The application 565, when executed by the processor 505, performs any processing or pre-processing on the three-dimensional data stored in the data structure 525. The application 565 also displays the three-dimensional image 450 of the field of view 430 (shown in FIG. 4), or a portion thereof, on the display 540. The three-dimensional image 450 may be presented to a passenger of the controlled system 410 or to a remote user. The user then sights the weapons subsystem (not shown) by indicating a subset 460 of the three-dimensional image 450. The manner in which this designation occurs will be implementation specific, depending upon the manner in which I/O is to occur, e.g., by touching a touch screen or designating with a mouse or joystick. - The invention admits wide variation in many aspects. For instance, the invention is not limited to ground-based or stationary controlled systems, and the user may be local or remote relative to the controlled system. Consider, for instance, the
scenario 600 shown in FIG. 6, in which the controlled system is implemented as a flying submunition 610 and the target is a moving ship 620. The flying submunition 610 transmits laser pulses 630 that are reflected from the ship 620. The three-dimensional data is extracted from the returned signals 640, and either the three-dimensional data or a three-dimensional image is transmitted to a remote user aboard, for instance, an aircraft 650. This information is transmitted by electromagnetic signals 660. Note that data pre-processing and processing usually occur where the three-dimensional image is generated, although this is not necessary to the practice of the invention. The three-dimensional image is then displayed to the remote user, who then makes the designation to sight the flying submunition 610 on the ship 620. The designation is transmitted by the electromagnetic signals 670 to the flying submunition 610. Upon receiving the designation, the flying submunition 610 then targets the ship 620 for destruction. Note that alternative scenarios might be ground-to-air or air-to-air scenarios. - FIG. 7 presents one
implementation 700 of the scenario 400 shown in FIG. 4. The implementation 700 is modified from a data acquisition and target identification process first shown in:
- This patent discloses a method by which targets are identified from three-dimensional images generated from three-dimensional data. The present invention employs a three-dimensional data set such as the one obtained by this prior art method to both sight and target the controlled system, i.e., a weapon platform in this scenario.
- In general, a LADAR system scans a target scene to provide on-site a three-dimensional image (representation) of the target scene. This image is processed to detect and segment potential targets. The segmentations representing these potential targets are then further processed by feature extracting and classification processes to identify the target. The segmentations of targets of interest previously completed prior to feature extraction and classification are either, or both, displayed locally or transmitted to a remote site for display. Because only the segmented targets rather than the entire scene is transmitted, this process allows communications over data links of limited bandwidth. A position within this segmented target image may then be sighted from this segmented three-dimensional image. A weapon system can then be targeted using the same three-dimensional data set.
- Referring now to FIG. 7, a
system 710 is shown for producing, processing, displaying, and transmitting images of one or more targets in a target scene 712. The system 710 includes a vehicle 714 for housing a weapon platform 715, a transmitter and sensor platform 716, and a processing center 717. The platform 716 includes a conventional LADAR system that generates and directs a laser beam to scan the target scene 712, including the targets. Reflected laser light is detected by the platform 716, and the processing center 717 processes the reflected light in a conventional manner into scan data representative of the target scene 712. A Global Positioning System (“GPS”) transmitter 718 transmits a signal for providing accurate position data for the vehicle 714. - The
system 710 processes three-dimensional LADAR images of the target scene 712 in the processing center 717, manipulates the resulting data into packets of information containing a segment of the scene, and transmits these packets of information by a communications link 720 to a remote site 722 for display on a display 724. The remote site 722 may be, for example, a combat platform or a command and control node and has access to data characterizing a local scene. - FIG. 8 depicts one embodiment of the
sensor platform 716 and processing center 717 housed on the vehicle 714 in FIG. 7. The sensor platform 716 supports a GPS receiver 830 disposed to receive a signal 833 from the GPS transmitter 718 in FIG. 7. A thermal imager 832 provides passive field-of-view search capability of the target scene 712 in FIG. 7. A LADAR sensor 834 generates a scan signal representative of an image of the target scene 712 in FIG. 7 by scanning a laser beam across the scene 712 and detecting reflections of the laser beam as it scans the scene 712. Although a number of different known LADAR arrangements may be employed, a suitable system is disclosed in:
- One suitable implementation for the
sensor 834 and associated pulse processing circuitry is disclosed in: - U.S. Pat. No. 5,243,553 entitled “Gate Array Pulse Capture Device” filed Jul. 2, 1991, to Loral Vought Systems Corp. as assignee of the inventor Stuart W. Flockencier.
- The
processing center 717 includes a digital processor 836 for processing the scan signal into three-dimensional LADAR images. A data manipulator 838 is provided for manipulating selected three-dimensional LADAR image data into packets of information that may be displayed by a local display terminal 840 at the vehicle 714 (shown in FIG. 7) for on-site display of the target 721. A data transmitter 842 is also provided for transmitting the packets of segmented information over the limited-bandwidth communications link 720 in FIG. 7. - In operation, the
vehicle 714 in FIG. 7 maneuvers into position to survey the target scene 712 in FIG. 7. The position of the vehicle 714 is read from the GPS receiver 830 housed on the sensor platform 716 in FIG. 7. The target scene 712 in FIG. 7 is scanned with the LADAR sensor 834, which is aligned with a compass providing a north reference. The LADAR sensor 834 collects data from the scanned target scene 712 in FIG. 7 and generates scan data representative of a three-dimensional image. The digital processor 836 of the processing center 717 further processes the scan data. This processing generally involves initially representing detected signals as data elements in a spherical coordinate system, wherein each data element includes a range value and an intensity value that correspond to a point on the target scene 712 in FIG. 7. - The
processor 836 then converts these data elements into a row-column format, where the row and column represent the two angles in the spherical coordinate system and the data element is the range. In particular, the digital processor 836 initially processes the scan data into a three-dimensional LADAR image according to a spherical coordinate system of some type, which has an origin that coincides with the location of the LADAR sensor's detecting optics. This may be performed in accordance with known techniques. - The spherical coordinate system is convenient in generating the three-dimensional images since the angular position of a point in the
target scene 712 in FIG. 7 may be measured with respect to axes that coincide with the axes of rotation of the LADAR sensor's detecting optics during scanning of the target scene 712 in FIG. 7. Moreover, the spherical coordinate system is conducive to storing the range of a point in the target scene 712 in FIG. 7, since this range corresponds to a radius from the LADAR sensor's detecting optics to the point. Each data element also includes an intensity value, representative of the intensity of the reflected light. Additionally, each data element includes an azimuth angle and an elevation angle. As indicated, this three-dimensional LADAR image is stored by the processor 836 in a row-column format for later use. - FIG. 9 illustrates the handling of the three-dimensional data set acquired as discussed immediately above. The LADAR three-dimensional data in row-column format (at 950) is further processed by the
digital processor 836 or, alternatively, by an off-site processor (not shown) such as a personal computer, a mini-computer, or other suitable computing device. This further processing generally involves preprocessing (at 952), detection (at 954), segmentation (at 956), feature extraction (at 958), and classification (at 960). - Generally, the preprocessing (at952) is directed to minimizing noise effects, such as identifying so-called intensity dropouts in the converted three-dimensional image, where the range value of the LADAR image is set to zero. Noise in the converted three-dimensional LADAR image introduced by low signal-to-noise ratio (“SNR”) conditions is processed so that performance of the overall system 10 is not degraded. In this regard, the converted LADAR image signal is used so that absolute range measurement distortion is minimized, edge preservation is maximized, and preservation of texture step (that results from actual structure in objects being imaged) is maximized.
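As an illustrative aside (not part of the disclosure itself), the ordering of the five stages just enumerated can be sketched as a simple processing chain. Each stage function below is a deliberately trivial placeholder; only the order of the stages is taken from the text:

```python
# Sketch of the five-stage chain: preprocessing -> detection -> segmentation
# -> feature extraction -> classification. The stage bodies are placeholder
# assumptions; only the chaining reflects the described process.

def preprocess(image):           # at 952: suppress noise and dropouts
    return image

def detect(image):               # at 954: locate regions of interest
    return [image]

def segment(regions):            # at 956: split target pixels from background
    return regions

def extract_features(segments):  # at 958: orientation, length, width, ...
    return [{"segment": s} for s in segments]

def classify(feature_sets):      # at 960: match against a target database
    return [dict(f, label="candidate") for f in feature_sets]

def process(image):
    return classify(extract_features(segment(detect(preprocess(image)))))

results = process("row_column_ladar_image")
```

Each stage consumes the previous stage's output, which is why the row-column image produced above is the natural entry point for the chain.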
- In general, detection (at954) identifies specific regions of interest in the three-dimensional LADAR image. The detection (at 954) uses range cluster scores as a measure to locate flat, vertical surfaces in an image. More specifically, a range cluster score is computed at each pixel to determine if the pixel lies on a flat, vertical surface. The flatness of a particular surface is determined by looking at how many pixels are within a given range in a small region of interest. The given range is defined by a threshold value that can be adjusted to vary performance. For example, if a computed range cluster score exceeds a specified threshold value, the corresponding pixel is marked as a detection. If a corresponding group of pixels meets a specified size criteria, the group of pixels is referred to as a region of interest. Regions of interest, for example those regions containing one or more targets, are determined and passed to a segmenter for further processing.
- Segmentation (at956) determines, for each detection of a target, which pixels in a region of interest belong to the detected target and which belong to the detected target's background. Segmentation (at 956) identifies possible targets, for example, those whose connected pixels exceed a height threshold above the ground plane. More specifically, the segmentation (at 956) separates target pixels from adjacent ground pixels and the pixels of nearby objects, such as bushes and trees.
- Feature extraction (at958) provides information about a segmentation (at 956) so that the target and its features in that segmentation can be classified. Features include, for example, orientation, length, width, height, radial features, turret features, and moments. The feature extraction (at 958) also typically compensates for errors resulting from segmentation (at 956) and other noise contamination. Feature extraction (at 958) generally determines a target's three-dimensional orientation and size and a target's size. The feature extraction (at 958) also distinguishes between targets and false alarms and between different classes of targets.
- Classification (at960) classifies segmentations to contain particular targets, usually in a two stage process. First, features such as length, width, height, height variance, height skew, height kurtosis, and radial measures are used to initially discard non-target segmentations. The segmentations that survive this step are then matched with true target data stored in a target database. The data in the target database, for example, may include length, width, height, average height, hull height, and turret height to classify a target. The classification (at 960) is performed using known methods for table look-ups and comparisons.
- Data obtained from the segmentation (at956), the feature extraction (at 958), and the classification (at 960) is assembled into a packet of information (not shown). The packet may be rapidly and accurately transmitted to the
remote site 722 in FIG. 7 and displayed in one of a variety of user-selectable formats. Typical formats include a three-view commonly used by armed forces to identify targets during combat, a north reference plan view, or a rotated perspective. These display options available to the operator, either local or remote, are based on the three-dimensional nature of the LADAR image. The results of the feature extraction (at 958) provide target information including orientation, length, width and height. The target image can be displayed from any perspective, independent of the sensor perspective, and the operator can select one of the several display formats that utilize the adjustable perspective. - Returning to FIG. 8, the data processed by the method900 in FIG. 9 is transmitted as described over the
communication link 720 to the remote site 722 and displayed on the remote display 724. Once the target image is displayed, the user can sight the weapon platform 715 on the target 721 (shown in FIG. 7) by indicating a point on the displayed image. Note that the sighting can also be performed in this manner on the local display 840, if desired. The sighted position is transmitted back to the processing center 717, whereupon the digital processor 836 issues pointing commands to the weapon platform 715. The control subsystem 844 then implements the pointing commands to complete the targeting. - Although the invention is described above relative to military fire control systems, the invention is not so limited. The above-described invention makes it possible in a number of military and civilian applications to integrate sighting and targeting activities more accurately than with the state of the art. For instance, robotic tools used in hazardous waste cleanup may be operated more safely and efficiently, since the robotic tool is sighted and targeted (e.g., on a hazardous waste deposit needing cleanup) from a common three-dimensional data set. Other civilian applications, such as law enforcement, robotic fire-fighting, crime deterrence, and border patrol functions, may also benefit from the application of the present invention. Thus, the present invention is not limited to applications found in a military context.
- Some portions of the detailed descriptions herein are presented in terms of a software-implemented process involving symbolic representations of operations on data bits within a memory in a computing apparatus. These descriptions and representations are the means used by those in the art to most effectively convey the substance of their work to others skilled in the art. These processes and operations require physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic, or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
- It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, or as may be apparent, throughout the present disclosure these descriptions refer to the action and processes of an electronic device that manipulates and transforms data represented as physical (electronic, magnetic, or optical) quantities within some electronic device's storage into other data similarly represented as physical quantities within the storage, or in transmission or display devices. Exemplary of the terms denoting such a description are, without limitation, the terms "processing," "computing," "calculating," "determining," "displaying," and the like.
- The following references are hereby incorporated by reference for their teachings regarding techniques and principles associated with cited subjects:
- data acquisition, pre-processing, processing, and display:
- U.S. Pat. No. 5,644,386, entitled "Visual Recognition System for LADAR Sensors," issued Jul. 1, 1997, to Loral Vought Systems Corp. as assignee of the inventors Gary Kim Jenkins, et al.;
- U.S. Pat. No. 5,424,823, entitled "System for Identifying Flat Orthogonal Objects Using Reflected Energy Signals," issued Jun. 13, 1995, to Loral Vought Systems Corporation as assignee of the inventors James L. Nettles, et al.;
- for selected hardware useful in implementing the invention in certain embodiments, and in particular data acquisition:
- U.S. Pat. No. 5,200,606, entitled "Laser Radar Scanning System," filed Jul. 2, 1991, with LTV Missiles and Electronics Group as assignee of the inventors Nicholas J. Krasutsky, et al.;
- U.S. Pat. No. 5,243,553, entitled "Gate Array Pulse Capture Device," filed Jul. 2, 1991, with Loral Vought Systems Corp. as assignee of the inventor Stuart W. Flockencier.
- Each of these patents is commonly assigned herewith.
- This concludes the detailed description. The particular embodiments disclosed above are illustrative only, as the invention may be modified and practiced in different but equivalent manners apparent to those skilled in the art having the benefit of the teachings herein. Furthermore, no limitations are intended to the details of construction or design herein shown, other than as described in the claims below. It is therefore evident that the particular embodiments disclosed above may be altered or modified and all such variations are considered within the scope and spirit of the invention. Accordingly, the protection sought herein is as set forth in the claims below.
Claims (46)
1. A method, comprising:
sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and
targeting a controlled system to the position from the three-dimensional data set.
2. The method of claim 1 , wherein the three-dimensional data comprises LADAR data.
3. The method of claim 1 , further comprising at least one of:
acquiring the three-dimensional data;
processing the three-dimensional data;
displaying a representation of the three-dimensional data;
displaying a projected target point after the controlled system is targeted; and
taking an action responsive to targeting the position.
4. The method of claim 3 , wherein acquiring the three-dimensional data includes:
transmitting a plurality of LADAR pulses; and
receiving the LADAR pulses after they are reflected.
5. The method of claim 3 , wherein processing the three-dimensional data includes generating a three-dimensional image from the three-dimensional data.
6. The method of claim 5 , wherein the three-dimensional image is the representation.
7. The method of claim 5 , wherein generating the three-dimensional image includes:
pre-processing the three-dimensional data;
detecting a target represented by a subset of the three-dimensional data;
segmenting the subset from the remainder of the three-dimensional data;
extracting features of the target from the segmented data; and
classifying the segmented subset as including a particular kind of target based on the extracted features.
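The processing sequence recited above — pre-process, detect, segment, extract features, classify — can be illustrated with a minimal sketch. All thresholds, the ground-plane rule, and the classification rule below are invented placeholders for illustration only, not the patented algorithm:

```python
def preprocess(points):
    # Drop dropouts (zero or negative range/height) from the raw 3-D samples.
    return [p for p in points if p[2] > 0]

def detect(points, ground=0.5):
    # "Detection": keep samples rising above an assumed ground plane.
    return [p for p in points if p[2] > ground]

def segment(points):
    # "Segmentation": here, trivially treat all detected samples as one target.
    return points

def extract_features(target):
    # Features of the kind the description mentions: length, width, height.
    xs, ys, zs = zip(*target)
    return {"length": max(xs) - min(xs),
            "width":  max(ys) - min(ys),
            "height": max(zs) - min(zs)}

def classify(features):
    # Toy rule standing in for a real classifier.
    return "vehicle" if features["length"] > features["height"] else "structure"

raw = [(0, 0, 0), (1, 0, 1), (4, 1, 1.5), (6, 2, 2)]
feats = extract_features(segment(detect(preprocess(raw))))
print(feats, classify(feats))
```

Each stage consumes the output of the previous one, so the whole pipeline operates on a single three-dimensional data set from acquisition through classification.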
8. The method of claim 1 , wherein sighting the position includes indicating a portion of a displayed image generated from the three-dimensional data.
9. The method of claim 8 , wherein targeting the controlled system includes aiming a weapon system at the sighted position.
10. The method of claim 1 , wherein targeting the controlled system includes aiming a weapon system at the sighted position.
11. An apparatus, comprising:
a program storage medium capable of storing a three-dimensional data set representing a field of view;
a controller capable of generating a presentation of the three-dimensional data set; and
a controller interface through which a position represented by at least a subset of the three-dimensional data can be sighted and through which the position can be targeted from the subset.
12. The apparatus of claim 11 , wherein the program storage medium comprises a magnetic program storage medium or an optical program storage medium.
13. The apparatus of claim 12 , wherein the magnetic program storage medium comprises a floppy disk, a zip disk, or a hard disk.
14. The apparatus of claim 12 , wherein the optical program storage medium comprises an optical disk.
15. The apparatus of claim 11 , wherein the controller comprises a digital processor.
16. The apparatus of claim 15 , wherein the digital processor is a microprocessor or a digital signal processor.
17. The apparatus of claim 11 , wherein the controller interface includes a display.
18. The apparatus of claim 17 , wherein the display is a helmet-mounted display or a rack-mounted display.
19. The apparatus of claim 17 , wherein the display includes a touch screen.
20. The apparatus of claim 17 , wherein the controller interface includes at least one peripheral input/output device.
21. A controlled system, comprising:
a data acquisition system capable of acquiring a three-dimensional data set representing a field of view;
a sighting and targeting subsystem, including:
a program storage medium capable of storing the three-dimensional data set;
a controller capable of generating a presentation of the three-dimensional data set; and
a controller interface through which a position represented by at least a subset of the three-dimensional data can be sighted and through which the position can be targeted from a presentation of the subset; and
a control subsystem capable of implementing instructions from the sighting and targeting subsystem.
22. The controlled system of claim 21 , wherein the data acquisition system includes a LADAR system.
23. The controlled system of claim 22 , wherein the LADAR system comprises a direct diode LADAR system.
24. The controlled system of claim 21 , wherein the control subsystem comprises a weapon pointing system.
25. A method, comprising:
acquiring a three-dimensional data set representing the content of a field of view;
generating a three-dimensional representation of the content from the three-dimensional data set;
displaying the three-dimensional representation;
sighting a position within the field of view from the three-dimensional representation; and
targeting the sighted position using the three-dimensional data set.
26. The method of claim 25 , wherein acquiring the three-dimensional data set includes:
transmitting a plurality of light pulses; and
receiving a plurality of the transmitted light pulses upon their reflection by an object in the field of view.
27. The method of claim 26 , further comprising:
extracting the three-dimensional data from the received light pulses; and
storing the received light pulses in a row-column format.
28. The method of claim 25 , wherein generating the three-dimensional representation includes:
detecting a region of interest in the three-dimensional image;
segmenting a target in the region of interest from the three-dimensional image;
extracting features of the segmented target; and
classifying the target from the extracted features.
29. The method of claim 25 , further comprising pre-processing the three-dimensional data.
30. The method of claim 25 , further comprising transmitting the generated three-dimensional image to a remote location before displaying the three-dimensional image.
31. An apparatus, comprising:
means for sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and
means for targeting a controlled system to the position from the three-dimensional data set.
32. The apparatus of claim 31 , wherein the three-dimensional data comprises LADAR data.
33. The apparatus of claim 31 , further comprising at least one of:
means for acquiring the three-dimensional data;
means for processing the three-dimensional data;
means for displaying a representation of the three-dimensional data;
means for displaying a projected target point after the controlled system is targeted; and
means for taking an action responsive to targeting the position.
34. The apparatus of claim 31 , wherein targeting the controlled system includes aiming a weapon system at the sighted position.
35. An apparatus, comprising:
means for storing a three-dimensional data set representing a field of view;
means for generating a presentation of the three-dimensional data set;
means for sighting a position represented by at least a subset of the three-dimensional data and for targeting the position from the subset.
36. The apparatus of claim 35 , wherein the storing means comprises a magnetic program storage medium or an optical program storage medium.
37. The apparatus of claim 35 , wherein the generating means comprises a digital processor.
38. The apparatus of claim 35 , wherein the sighting and targeting means includes a display.
39. The controlled system of claim 21 , wherein the program storage medium comprises a magnetic program storage medium or an optical program storage medium.
40. The controlled system of claim 39 , wherein the magnetic program storage medium comprises a floppy disk, a zip disk, or a hard disk.
41. The controlled system of claim 21 , wherein the controller comprises a digital processor.
42. The controlled system of claim 21 , wherein the controller interface includes a display.
43. The controlled system of claim 42 , wherein the display includes a touch screen.
44. The method of claim 25 , wherein sighting the position includes indicating a portion of a displayed image generated from the three-dimensional data.
45. The method of claim 25 , wherein targeting the controlled system includes aiming a weapon system at the sighted position.
46. The method of claim 25 , wherein targeting the controlled system includes aiming a weapon system at the sighted position.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/060,565 US20030140775A1 (en) | 2002-01-30 | 2002-01-30 | Method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/060,565 US20030140775A1 (en) | 2002-01-30 | 2002-01-30 | Method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030140775A1 true US20030140775A1 (en) | 2003-07-31 |
Family
ID=27610023
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/060,565 Abandoned US20030140775A1 (en) | 2002-01-30 | 2002-01-30 | Method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set |
Country Status (1)
Country | Link |
---|---|
US (1) | US20030140775A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030136253A1 (en) * | 2002-01-24 | 2003-07-24 | Hans Moser | Combat vehicle having an observation system |
US20070208459A1 (en) * | 2006-03-03 | 2007-09-06 | Samsung Techwin Co., Ltd. | Sentry robot |
US20070219720A1 (en) * | 2006-03-16 | 2007-09-20 | The Gray Insurance Company | Navigation and control system for autonomous vehicles |
US8245623B2 (en) | 2010-12-07 | 2012-08-21 | Bae Systems Controls Inc. | Weapons system and targeting method |
US20130002525A1 (en) * | 2011-06-29 | 2013-01-03 | Bobby Duane Foote | System for locating a position of an object |
DE102011106810A1 (en) * | 2011-07-07 | 2013-01-10 | Testo Ag | Handheld type thermal imaging camera has image analyzing and processing unit that is provided to perform image analyzing and image processing for selected picture area |
WO2013144502A1 (en) | 2012-03-29 | 2013-10-03 | Nexter Systems | Method for acquiring the coordinates of a triggering point of a projectile and fire control implementing such a method |
US20150025788A1 (en) * | 2011-12-20 | 2015-01-22 | Sadar 3D, Inc. | Systems, apparatus, and methods for acquisition and use of image data |
US9121946B2 (en) * | 2013-09-13 | 2015-09-01 | Fu Tai Hua Industry (Shenzhen) Co. Ltd. | Automobile with anti-collision function and anti-collision method |
US9766042B2 (en) * | 2015-10-26 | 2017-09-19 | Huntercraft Limited | Integrated precise photoelectric sighting system |
US11441874B2 (en) * | 2017-11-10 | 2022-09-13 | Hanwha Defense Co., Ltd. | Remote weapon control device and method for targeting and shooting multiple objects |
CN116468797A (en) * | 2023-03-09 | 2023-07-21 | 北京航天众信科技有限公司 | Aiming method and device for rail-mounted robot and computer equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4267562A (en) * | 1977-10-18 | 1981-05-12 | The United States Of America As Represented By The Secretary Of The Army | Method of autonomous target acquisition |
US5448936A (en) * | 1994-08-23 | 1995-09-12 | Hughes Aircraft Company | Destruction of underwater objects |
US5644386A (en) * | 1995-01-11 | 1997-07-01 | Loral Vought Systems Corp. | Visual recognition system for LADAR sensors |
US5682229A (en) * | 1995-04-14 | 1997-10-28 | Schwartz Electro-Optics, Inc. | Laser range camera |
US5868675A (en) * | 1989-10-05 | 1999-02-09 | Elekta Igs S.A. | Interactive system for local intervention inside a nonhomogeneous structure
US5877851A (en) * | 1997-09-24 | 1999-03-02 | The United States Of America As Represented By The Secretary Of The Army | Scannerless ladar architecture employing focal plane detector arrays and FM-CW ranging theory |
US20020113865A1 (en) * | 1997-09-02 | 2002-08-22 | Kotaro Yano | Image processing method and apparatus |
US20020149628A1 (en) * | 2000-12-22 | 2002-10-17 | Smith Jeffrey C. | Positioning an item in three dimensions via a graphical representation |
US6580429B1 (en) * | 2000-01-25 | 2003-06-17 | Mitsubishi Denki Kabushiki Kaisha | Method of generating data for three-dimensional graphic recording medium and three-dimensional graphic system |
US6662036B2 (en) * | 1991-01-28 | 2003-12-09 | Sherwood Services Ag | Surgical positioning system |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4267562A (en) * | 1977-10-18 | 1981-05-12 | The United States Of America As Represented By The Secretary Of The Army | Method of autonomous target acquisition |
US5868675A (en) * | 1989-10-05 | 1999-02-09 | Elekta Igs S.A. | Interactive system for local intervention inside a nonhomogeneous structure
US6662036B2 (en) * | 1991-01-28 | 2003-12-09 | Sherwood Services Ag | Surgical positioning system |
US5448936A (en) * | 1994-08-23 | 1995-09-12 | Hughes Aircraft Company | Destruction of underwater objects |
US5644386A (en) * | 1995-01-11 | 1997-07-01 | Loral Vought Systems Corp. | Visual recognition system for LADAR sensors |
US5682229A (en) * | 1995-04-14 | 1997-10-28 | Schwartz Electro-Optics, Inc. | Laser range camera |
US20020113865A1 (en) * | 1997-09-02 | 2002-08-22 | Kotaro Yano | Image processing method and apparatus |
US5877851A (en) * | 1997-09-24 | 1999-03-02 | The United States Of America As Represented By The Secretary Of The Army | Scannerless ladar architecture employing focal plane detector arrays and FM-CW ranging theory |
US6580429B1 (en) * | 2000-01-25 | 2003-06-17 | Mitsubishi Denki Kabushiki Kaisha | Method of generating data for three-dimensional graphic recording medium and three-dimensional graphic system |
US20020149628A1 (en) * | 2000-12-22 | 2002-10-17 | Smith Jeffrey C. | Positioning an item in three dimensions via a graphical representation |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030136253A1 (en) * | 2002-01-24 | 2003-07-24 | Hans Moser | Combat vehicle having an observation system |
US7032495B2 (en) * | 2002-01-24 | 2006-04-25 | Rheinmetall Landsysteme Gmbh | Combat vehicle having an observation system |
US20070208459A1 (en) * | 2006-03-03 | 2007-09-06 | Samsung Techwin Co., Ltd. | Sentry robot |
US20070219720A1 (en) * | 2006-03-16 | 2007-09-20 | The Gray Insurance Company | Navigation and control system for autonomous vehicles |
US8050863B2 (en) * | 2006-03-16 | 2011-11-01 | Gray & Company, Inc. | Navigation and control system for autonomous vehicles |
US8346480B2 (en) | 2006-03-16 | 2013-01-01 | Gray & Company, Inc. | Navigation and control system for autonomous vehicles |
US8245623B2 (en) | 2010-12-07 | 2012-08-21 | Bae Systems Controls Inc. | Weapons system and targeting method |
US20130002525A1 (en) * | 2011-06-29 | 2013-01-03 | Bobby Duane Foote | System for locating a position of an object |
DE102011106810A1 (en) * | 2011-07-07 | 2013-01-10 | Testo Ag | Handheld type thermal imaging camera has image analyzing and processing unit that is provided to perform image analyzing and image processing for selected picture area |
DE102011106810B4 (en) * | 2011-07-07 | 2016-08-11 | Testo Ag | Thermal imaging camera and method for image analysis and / or image processing of an IR image with a thermal imaging camera |
US20150025788A1 (en) * | 2011-12-20 | 2015-01-22 | Sadar 3D, Inc. | Systems, apparatus, and methods for acquisition and use of image data |
WO2013144502A1 (en) | 2012-03-29 | 2013-10-03 | Nexter Systems | Method for acquiring the coordinates of a triggering point of a projectile and fire control implementing such a method |
FR2988859A1 (en) * | 2012-03-29 | 2013-10-04 | Nexter Systems | Method for acquiring the coordinates of a triggering point of a projectile and fire control implementing such a method
US9488443B2 (en) | 2012-03-29 | 2016-11-08 | Nexter Systems | Method for acquiring the coordinates of a triggering point of a projectile and fire control implementing such a method |
AU2013239535B2 (en) * | 2012-03-29 | 2017-08-31 | Nexter Systems | Method for acquiring the coordinates of a triggering point of a projectile and fire control implementing such a method |
US9121946B2 (en) * | 2013-09-13 | 2015-09-01 | Fu Tai Hua Industry (Shenzhen) Co. Ltd. | Automobile with anti-collision function and anti-collision method |
US9766042B2 (en) * | 2015-10-26 | 2017-09-19 | Huntercraft Limited | Integrated precise photoelectric sighting system |
US11441874B2 (en) * | 2017-11-10 | 2022-09-13 | Hanwha Defense Co., Ltd. | Remote weapon control device and method for targeting and shooting multiple objects |
CN116468797A (en) * | 2023-03-09 | 2023-07-21 | 北京航天众信科技有限公司 | Aiming method and device for rail-mounted robot and computer equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090228159A1 (en) | Dual fov imaging semi-active laser system | |
JP3345113B2 (en) | Target object recognition method and target identification method | |
US7719664B1 (en) | Imaging semi-active laser system | |
Bhanu et al. | Image understanding research for automatic target recognition | |
US20090268030A1 (en) | Integrated video surveillance and cell phone tracking system | |
US5644386A (en) | Visual recognition system for LADAR sensors | |
US7870816B1 (en) | Continuous alignment system for fire control | |
US5267329A (en) | Process for automatically detecting and locating a target from a plurality of two dimensional images | |
US20030140775A1 (en) | Method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set | |
US10408574B2 (en) | Compact laser and geolocating targeting system | |
US20100002910A1 (en) | Method and Apparatus for Developing Synthetic Three-Dimensional Models from Imagery | |
US5341435A (en) | System for detection and recognition of an object by video imaging means | |
CN112068111A (en) | Unmanned aerial vehicle target detection method based on multi-sensor information fusion | |
EP0173534A2 (en) | Data processing arrangements | |
US20070127008A1 (en) | Passive-optical locator | |
US8103056B2 (en) | Method for target geo-referencing using video analytics | |
Sheu et al. | Dual-axis rotary platform with UAV image recognition and tracking | |
US8958654B1 (en) | Method and apparatus for enhancing three-dimensional imagery data | |
US10240900B2 (en) | Systems and methods for acquiring and launching and guiding missiles to multiple targets | |
US20110181722A1 (en) | Target identification method for a weapon system | |
US20220392098A1 (en) | Target classification system | |
KR20240021670A (en) | Method for monitoring unidentified object using an artificial intelligence model learned based on the type of unidentified objects | |
Hughes et al. | Advances in automatic electro-optical tracking systems | |
JPH0798218A (en) | Menace identification device | |
RU2789117C2 (en) | Three-coordinate device for detection and recognition of objects with monocular optoelectronic devices of on-ground and aerial robotic complexes based on stereoscopic 3d monitoring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LOCKHEED MARTIN CORPORATION, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STEWART, JOHN R.;REEL/FRAME:012552/0697 Effective date: 20020124 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |