WO2013110072A1 - Surface feature detection by radiation analysis - Google Patents


Info

Publication number
WO2013110072A1
WO2013110072A1 · PCT/US2013/022556
Authority
WO
WIPO (PCT)
Prior art keywords
radiation
feature
data
computing device
detection
Prior art date
Application number
PCT/US2013/022556
Other languages
French (fr)
Inventor
Keith W. CUNNINGHAM
Donald K. ATWOOD
Original Assignee
University Of Alaska
Priority date
Filing date
Publication date
Application filed by University Of Alaska filed Critical University Of Alaska
Publication of WO2013110072A1 publication Critical patent/WO2013110072A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/47: Scattering, i.e. diffuse reflection
    • G01N 2021/4704: Angular selective
    • G01N 2021/4709: Backscatter
    • G01N 2021/4735: Solid samples, e.g. paper, glass

Definitions

  • radiation data based on radiation backscattered from a surface can be received.
  • a feature of the surface can be identified in the radiation data.
  • Optical imaging data associated with the surface can be received.
  • a portion of the optical imaging data comprising the feature can be identified.
  • a geographic location can be associated with the feature.
  • exemplary systems can comprise a radiation detection device configured to receive radiation backscattered from a surface.
  • exemplary systems can also comprise an optical detection device configured to receive optical imaging data associated with the surface.
  • exemplary systems can further comprise a computing device configured to: receive radiation data from the radiation detection device, and the radiation data can be based on the radiation backscattered from the surface; identify in the radiation data a feature of the surface; receive the optical imaging data associated with the surface from the optical detection device; identify a portion of the optical imaging data comprising the feature; and associate a geographic location with the feature.
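As a rough illustration (not the patented implementation), the computing-device steps in the bullet above can be sketched as two functions: flag radiation samples whose backscatter strength exceeds a threshold, then attach a location record to each flagged sample. All names, thresholds, and coordinates below are hypothetical.

```python
# Illustrative sketch of the flow described above; names are hypothetical.

def identify_feature(radiation_data, threshold=3.0):
    """Return indices of radiation samples whose backscatter strength
    exceeds a threshold (candidate surface features)."""
    return [i for i, strength in enumerate(radiation_data) if strength > threshold]

def associate_location(feature_indices, gps_track):
    """Pair each flagged radiation sample with the GPS fix recorded at the
    same sample index (assumes the two data sets are co-sampled)."""
    return [gps_track[i] for i in feature_indices]

radiation_data = [0.8, 0.9, 5.6, 1.0]        # strong return at index 2
gps_track = [(64.8401, -147.7200), (64.8402, -147.7200),
             (64.8403, -147.7201), (64.8404, -147.7201)]

hits = identify_feature(radiation_data)
print(associate_location(hits, gps_track))   # → [(64.8403, -147.7201)]
```

In a fuller system the same indices would also select the corresponding portion of the optical imaging data, as the bullet describes.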
  • Figures 1A and 1B are illustrations of an exemplary system for feature detection
  • Figure 2 is a diagram illustrating an exemplary system for feature detection
  • Figure 3 is a plot illustrating exemplary radiation data
  • Figure 4 is a plot illustrating radiation data of an example concrete sidewalk
  • Figures 5A and 5B are plots illustrating radiation data of exemplary asphalt roads
  • Figures 6A and 6B illustrate radiation data collected from an aircraft
  • Figure 7 is a flowchart illustrating an exemplary method for feature detection
  • Figure 8 illustrates a flowchart of an exemplary method for monitoring a solid surface via close-range SAR in accordance with one or more aspects of the disclosure.
  • Figure 9 illustrates a block diagram of an exemplary operating environment having a computing device that enables various features of the disclosure and performance of the various methods disclosed herein.
  • Ranges may be expressed herein as from “about” one particular value, and/or to "about” another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent "about,” it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint.
  • a unit may be, but is not limited to being, a process running on a processor, a processor, an object, an executable computer program, a thread of execution, a program, a memory (e.g., a hard disc drive), and/or a computer.
  • a unit can be an apparatus with specific functionality provided by mechanical parts operated by electric or electronic circuitry which is operated by a software or a firmware application executed by a processor, wherein the processor can be internal or external to the apparatus and executes at least a part of the software or firmware application.
  • a unit can provide specific functionality based on physical structure or specific arrangement of hardware elements.
  • a unit can be an apparatus that provides specific functionality through electronic functional elements without mechanical parts, the electronic functional elements can include a processor therein to execute software or firmware that provides at least in part the functionality of the electronic functional elements.
  • An illustration of such apparatus can be control circuitry, such as a programmable logic controller.
  • the subject disclosure relates to monitoring and inspection of surfaces, such as solid surfaces, with remote sensing technologies integrated with position sensors to geolocate (e.g., to associate a data record with a location record) the remotely sensed digital data sets.
  • the remote sensing technologies can comprise one or more of radiation detection units, such as radio detection and ranging (RADAR), and optical detection devices such as laser profilers and optical cameras.
  • the solid surfaces can comprise road pavement comprising asphalt, concrete, and the like.
  • One or more digital data sets indicative of at least one image of a probed surface can be output to a computer device (e.g., user equipment) for analysis, such as maintenance planning and the like.
  • Radiation detection devices such as RADAR devices, as employed in the disclosure can be utilized to image and characterize road and bridge pavement, for example, to assess and monitor pavement condition, cracking, and/or roughness.
  • radiation imaging of a solid surface as described herein, can provide suitable data for locating and characterizing the condition and degree of cracking of the solid surface (e.g., pavement).
  • the disclosed radiation data detection can augment traditional camera imaging of solid surfaces, such as pavement.
  • the disclosed radiation detection devices, when used in conjunction with video imaging techniques, for example, can permit automated detection of pavement cracking and conditions, speeding up the manual inspection and analysis of video imagery.
  • the disclosure can be utilized by pavement engineers, pavement imaging companies, city and county engineers, and state and federal departments of transportation.
  • Exemplary salient aspects of the disclosure can comprise the following: (1) radiation detection devices can collect data at night, during the day, and through fog; (2) radiation detection devices can automatically detect pavement joints and cracking orthogonal to the radar line of sight, providing an effective tool to automatically flag data, such as features on the surfaces, collected from optical imaging systems such as cameras and light detection and ranging devices; (3) three dimensional models of the feature and/or surface can be assembled from the radiation data; (4) temporal measurements are also enabled, particularly through interferometry, allowing automated change detection of the feature; (5) small-scale (or minute) pavement features, including pavement cracking, are readily discernible with radiation detection devices; and/or (6) surface roughness can be more easily quantified.
  • FIG. 1 A is a side-view illustrating an exemplary system 102 for feature detection.
  • FIG. IB is a top-down view illustrating the exemplary system 102 for feature detection.
  • the system 102 can comprise a transport device 104 configured to move the system.
  • the transport device 104 can be a truck, van, car, and the like for traveling along a roadway 116.
  • the transport device 104 can comprise an aircraft, as shown in FIG. 2.
  • the system 102 can also comprise a location device 106 configured to receive location information.
  • the location device 106 can comprise a global positioning systems (GPS) device configured to receive geographic coordinates from one or more satellite devices.
  • the location device 106 can comprise an inertial reference system configured to provide coordinates based on a previous location.
  • the system 102 can comprise an optical detection device 108 configured to receive optical imaging data associated with a surface 110.
  • the optical detection device 108 can comprise an imaging device, such as a camera.
  • the optical detection device 108 can comprise a light detection and ranging device (LIDAR).
  • the LIDAR device can be configured to detect distance of the feature from the LIDAR device and spatial characteristics of the feature.
  • the optical detection device 108 can comprise a laser emitter and laser detector.
  • the system 102 can comprise a radiation detection device 112 configured to receive radiation backscattered from the surface.
  • the radiation detection device 112 can be configured to provide radiation to the surface. Additionally, at least a portion of the radiation can yield the radiation backscattered from the surface in response to interacting with the surface.
  • the surface can comprise one or more features 114.
  • the features can be morphological features, such as cracks, holes, and the like.
  • the features can be located, for example, on a roadway 116, a sidewalk 118, a parking lot, a driveway, a rooftop, an aircraft runway, a bridge, and the like.
  • the radiation detection device 112 can comprise a first positioning device 120.
  • the first positioning device 120 can position a first radiation source 122 and first radiation sensor 124 at an angle 126 to the surface 110.
  • the first positioning device 120 can comprise one or more antennae configured to emit and receive electromagnetic waves.
  • the radiation detection device 112 can comprise a second positioning device 121 for a second radiation source 123 and a second radiation sensor 125.
  • the first radiation source 122 can be oriented along the direction of travel 127
  • the second radiation source 123 can be oriented perpendicular to the direction of travel 127.
  • the first radiation source 122 can detect features oriented perpendicular to the direction of travel 127
  • the second radiation source 123 can detect features oriented parallel to the direction of travel 127.
  • the radiation detection device 112 can comprise additional radiation sources oriented in various directions. Additionally, in some configurations, the radiation detection device 112 can be limited to either the first radiation source 122 or the second radiation source 123.
  • the radiation sources 122 and 123 can emit electromagnetic radiation, such as electromagnetic waves.
  • the radiation sources 122 and 123 can emit radio waves (e.g., electromagnetic radiation with frequencies between 4 kHz and 300 GHz).
  • the radiation sensors 124 and 125 can detect the radiation from the radiation sources 122 and 123 after the radiation is reflected (e.g., backscattered) back to the radiation sensor 124.
  • the radiation detection device 112 can comprise one or more RADAR devices.
  • the radiation detection device 112 can comprise a miniaturized RADAR device.
  • the radiation detection device 112 can comprise a signal processor configured to process signals received at the radiation sensors 124 and 125.
  • the radiation detection device 112 can comprise an input/output interface for communicating with other devices, such as the optical imaging device 108, location device 106, a computing device 128, and/or the like.
  • the radiation detection device 112 can comprise memory for storing the radiation data. Additionally, the radiation data can be provided to and stored at the computer device 128 or other location.
  • system 102 can comprise a computing device 128
  • the computing device 128 can be configured to receive radiation data from the radiation detection device 112. For example, the radiation data can be based on the radiation backscattered from the surface 110.
  • the computing device 128 can be configured to identify in the radiation data a feature 114 of the surface 110.
  • the computing device 128 can be configured to identify from the radiation data a signal strength value above a threshold value.
  • the signal strength value can indicate the amount of radiation backscattered from one or more locations on the surface.
  • computing device can be configured to identify a pattern of the signal strength values over time.
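One plausible way to combine the threshold test with a pattern over time, sketched below under stated assumptions: a lone noisy spike is ignored, and only runs of consecutive above-threshold samples (a persistent signature, like the "smear" discussed later) are reported. The function name and minimum-run rule are illustrative, not the patent's method.

```python
# Hedged sketch: threshold plus temporal-pattern detection on a 1-D
# backscatter signal. Only runs of >= min_run consecutive above-threshold
# samples are flagged, suppressing isolated noise spikes.

def detect_runs(signal, threshold, min_run=2):
    """Return (start, end) index pairs for runs of at least min_run
    consecutive samples whose strength exceeds threshold."""
    runs, start = [], None
    for i, v in enumerate(signal):
        if v > threshold and start is None:
            start = i                      # a run begins
        elif v <= threshold and start is not None:
            if i - start >= min_run:
                runs.append((start, i - 1))
            start = None                   # the run ended
    if start is not None and len(signal) - start >= min_run:
        runs.append((start, len(signal) - 1))
    return runs

signal = [0.5, 4.2, 0.6, 3.9, 4.1, 4.4, 0.7]   # lone spike, then a 3-sample run
print(detect_runs(signal, threshold=3.0))        # → [(3, 5)]
```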
  • the computing device 128 can further be configured to receive the optical imaging data associated with the surface 110 from the optical detection device 108.
  • the computing device 128 can be configured to receive distance of the feature 114 from the light detection and ranging device and spatial characteristics of the feature 114 from the light detection and ranging device.
  • the computing device 128 can also be configured to identify a portion of the optical imaging data comprising the feature 114.
  • the computing device 128 can be configured to associate a geographic location with the feature.
  • the computing device 128 can be configured to receive geographic data from the location device 106 (e.g., at least one of the geographic positioning system device and the inertial reference system device).
  • the computing device 128 can be further configured to provide a representation of the feature based on the optical imaging data and the associated geographic location.
  • the representation of the feature 114 comprises a three dimensional computer generated model.
  • FIG. 2 is a diagram illustrating an exemplary system 202 for feature detection.
  • the system 202 can comprise a transport device 204.
  • the transport device 204 can comprise an aircraft.
  • the aircraft can comprise an unmanned aerial system.
  • the transport device 204 can comprise a ground vehicle, such as a car, truck, van, and the like.
  • the transport device 204 can be coupled to measuring devices, such as the optical detection device 108, location device 106, and radiation detection device 112 described herein.
  • One or more of the measuring devices can be oriented at a look-angle 206.
  • the look angle 206 can determine a portion 208 of a surface measured by the measuring devices.
  • the radiation detection device 112 can emit radiation (e.g., electromagnetic waves) at the look angle 206 and receive reflected radiation at the look angle 206.
  • the height 210 (e.g., altitude) of the transport device 204, the velocity of the transport device 204, the geographic location of the transport device 204, and other measurements can be used to determine the geographic location of one or more features on the surface.
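A minimal geometry sketch of that geolocation step, assuming a flat surface and a side-looking sensor: the imaged spot lies a horizontal distance of height × tan(look angle) from the point directly below the platform. The function names and the right-side-looking convention are assumptions for illustration.

```python
# Flat-surface, side-looking geolocation sketch (illustrative only).
import math

def ground_offset(height_m, look_angle_deg):
    """Horizontal distance from the sub-platform point to the imaged spot."""
    return height_m * math.tan(math.radians(look_angle_deg))

def feature_position(platform_xy, heading_deg, height_m, look_angle_deg):
    """Offset the platform's ground position perpendicular to its heading
    (side-looking geometry) to estimate the feature's xy position."""
    d = ground_offset(height_m, look_angle_deg)
    # heading measured clockwise from north (+y); right-perpendicular unit
    # vector of heading (sin h, cos h) is (cos h, -sin h)
    hx = math.sin(math.radians(heading_deg))
    hy = math.cos(math.radians(heading_deg))
    return (platform_xy[0] + d * hy, platform_xy[1] - d * hx)

# At a 45 degree look angle, the offset equals the platform height:
print(round(ground_offset(300.0, 45.0), 1))   # → 300.0
```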
  • the radiation detection device 112 can collect backscattered radiation off a surface, and computer-executable instructions can permit data processing and related feature extraction.
  • software can be configured (e.g., programmed) to discriminate signal from noise and generate imaging data indicative of imagery of a monitored solid surface.
  • the software can be configured (e.g., programmed or coded) according to one or more various methods of the disclosure. Such methods can be grouped into two broad categories: discrimination, and classification.
  • the discrimination process can broadly encompass processing, filtering, analysis, and identification of features.
  • preprocessing steps can occur in real-time or near-real-time.
  • the raw radiation data, optical imaging data, and the geocoding information can be retained for later processing and analysis.
  • Classification algorithms can evaluate the radiation data for several broad categories of data, such as surface texture, surface cracking, and other features on the surface that are not intrinsically part of the surface.
  • Surface texture can comprise roughness, grooving, and texture.
  • the surface structure can broadly comprise the joints found in pavement intentionally placed to permit the expansion and contraction of pavement surfaces as well as the undesirable cracking of pavement that occurs due to wear and tear of traffic and weather. Cracking can take on a multitude of forms because the damage can occur in many different ways.
  • the third category of classification can comprise identifying features on the surface of pavement that are not part of the pavement structure, such as gravel, stray items that are lost from passing traffic, materials blown onto road surfaces from weather, and other unexpected items. Both the discrimination and classification methods can be based on mathematical and statistical filters derived from machine learning and expert rules.
  • the radiation data can augment data received from optical imaging devices, such as optical and laser sensors.
  • One or more aspects of the disclosure can discriminate—e.g., output data that can permit identification of specific features— concrete pavement from asphalt pavement.
  • the joints and cracks in pavement being identified can be smaller than the theoretical resolution of the radar imaging sensors.
  • two types of theoretical resolution can be contemplated. The first includes the real resolution of the radar sensor determined by the antenna size and frequency at which it operates (e.g., L-band, X-band, and Ku-band).
  • the second form of operation, using a synthetic aperture, can support higher resolution than that of a real aperture, in which the motion of the vehicle is used to synthesize a larger effective antenna size with a resultant enhancement of resolution.
  • a radiation detection device can emit electromagnetic waves with a wavelength of 16 cm, an example L-band wavelength.
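A short worked example of the two resolution regimes discussed above, using the standard textbook approximations: real-aperture azimuth resolution scales as range × wavelength / antenna length, while stripmap synthetic-aperture azimuth resolution is roughly antenna length / 2, independent of range. The specific numbers are illustrative, not taken from the patent.

```python
# Real-aperture vs. synthetic-aperture azimuth resolution (textbook
# approximations; all parameter values are illustrative).

def real_aperture_res(range_m, wavelength_m, antenna_m):
    """Real-aperture azimuth resolution ~ R * lambda / L."""
    return range_m * wavelength_m / antenna_m

def sar_azimuth_res(antenna_m):
    """Stripmap SAR azimuth resolution ~ L / 2, independent of range."""
    return antenna_m / 2.0

# 0.18 m antenna (~7 inches), Ku-band (~2 cm wavelength), 10 m range:
print(real_aperture_res(10.0, 0.02, 0.18))  # ~1.1 m real-aperture resolution
print(sar_azimuth_res(0.18))                # 0.09 m with a synthetic aperture
```

This illustrates why a synthetic aperture can resolve features far finer than the physical antenna's diffraction-limited footprint.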
  • the radiation detection device 112 can be configured as an active remote sensing technology. Similar to the light emissions from a laser profiler, the radar detection device 112 can emit energy in the form of electromagnetic waves, albeit at a different wavelength than the optical detection device 108. Both sensors can operate as a radiation source and thereby illuminate a target surface. In addition, the radiation detection device 112 can detect reflected radiation (or backscatter) and data indicative of strength of the radiation. The radiation detection device 112 can also record an amount of backscatter. In one aspect, at least one feature of the collected data can serve as a fingerprint, or can be indicative, of a specific morphological feature (e.g., a crack or a boundary, such as joint) of the monitored surface. Such fingerprint provided by the data indicative of the reflected energy can be utilized to characterize the received signal and to create an image.
  • both the optical detection device 108 and radiation detection device 112 can emit many pulses of energy per second to create an image.
  • pulses can be transmitted at rates on the order of kHz.
  • high-precision clocks on the optical detection device 108 and radiation detection device 112 can record the time for each pulse of energy to return.
  • Such information can be employed to calculate a distance or range to the feature reflecting the energy. Accordingly, these technologies can be referred to in manners that include the term "ranging”: Light Detection and Ranging (LIDAR) and Radio Detection and Ranging (RADAR).
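The ranging calculation referred to above is the same for both technologies: a pulse travels out and back, so the range is the speed of light times the round-trip time, divided by two.

```python
# Pulse time-of-flight ranging: range = c * round_trip_time / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_echo(round_trip_s):
    """Distance to the reflecting feature given the echo's round-trip time."""
    return C * round_trip_s / 2.0

# An echo returning after ~66.7 ns corresponds to a target ~10 m away:
print(round(range_from_echo(66.7e-9), 2))  # ≈ 10.0
```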
  • solid surfaces that can be probed can comprise asphalt pavement and concrete pavement in a variety of conditions of such surfaces.
  • the radiation detection device 112 can be configured to emit radio waves in the Ku-band, X-band, and/or other electromagnetic radiation bands.
  • the strength of the returned radiation (e.g., visible radiation, microwave radiation) signal can depend on the characteristics of the imaged target. For example, certain surfaces can reflect laser and microwave energy very well, whereas other surfaces can scatter energy in a manner that does not return to the sensor and thus reduces the returned signal. In general, when laser or microwave energy impinges onto a smooth surface at an angle, the energy can be reflected away; whereas a rough surface can diffusively scatter the energy in various directions, returning a portion of the incident energy back to a sensor (e.g., the first radiation sensor 124 and/or second radiation sensor 125) that is part or functionally coupled to the radiation detection device 112.
  • the radiation detection device 112 also can comprise or be functionally coupled to a RADAR source (e.g., a source of radio waves, such as electromagnetic radiation having wavelengths ranging from about 1 cm to about 1 m).
  • Energy scattered away from the radiation detection device 112 typically results in less returned energy and, thus, detected signals with lower strength.
  • a low-strength signal can be represented as a dark feature in an image.
  • a rougher surface can scatter microwaves and, thus, can create a region of gray in the image.
  • the brightest features in an image of a probed surface can occur in a scenario in which the radiation incident on the probed surface is reflected directly back to the sensor.
  • roughness of the probed surface can be recorded as various shades of gray in an image.
  • the look angle (e.g., the angle at which the radar sensor images a surface, such as look angle 126 or look angle 206) can determine how the energy is scattered and reflected back (or backscattered) and, thus, how a resulting image can be formed.
  • the resulting image can be referred to as a microwave image.
  • RADAR's microwave energy can obliquely strike the surface (e.g., a paved surface).
  • a radiation detection device 112 can be mounted to the transportation device 104 (e.g., in the bed of a truck) pointed backwards and oriented at an angle (e.g., 45 degrees) towards the pavement.
  • Such configuration is referred to as the "real aperture” because the "aperture” is the size of the antenna which creates a footprint on the ground that is determined by diffraction theory.
  • the first positioning device 120, first radiation source 122, and first radiation sensor 124 can be configured for detection based on a real aperture.
  • the radiation detection device 112 can be positioned to point down to the side of the vehicle, such that the radiation source is oriented perpendicular to its direction of travel.
  • Such configuration which is referred to as "synthetic aperture radar” can permit utilization of the vehicle's movement to collect sequential sets of radiation data of a probed solid surface. These sequential images can then be synthesized into a single image that achieves the high resolution (e.g., cm-range) associated with a very large "synthesized" aperture.
  • the second positioning device 121, second radiation source 123, and second radiation sensor 125 can be configured for detection based on a synthetic aperture.
  • a radiation detection device 112 configured with a synthesized aperture can detect minute cracking oriented orthogonal to the side-look angle (e.g., look angle 206) of the radiation detection device 112.
  • the theoretical resolution of a radiation detection device 112 configured to emit radiation in the Ku-band (e.g., from about 12 GHz to about 18 GHz) can be coarser than the cracks detected, yet such cracks can be clearly visible in imagery produced from collected imaging data (e.g., strength of backscattered radiation).
  • a radiation detection device 112 configured with a synthetic aperture can operate from an aircraft flown at, for example, 1,000 feet above ground level.
  • a radiation detection device 112 configured in the real aperture configuration can detect linear cracking that can be oriented orthogonal to the radar angle of view (e.g., look angle 126).
  • an exemplary radiation detection device 112 in the real aperture configuration can be designed to detect cracks as a transport device 104 carrying the detection equipment moves over a solid surface (e.g., pavement) being probed. Radiation data collected in this configuration can be plotted as a "waterfall" of pavement data continuously moving.
  • a feature, such as a crack can appear as a smear through time, with the image having time on one axis and the crack "signature" on the other axis (see, for example, FIGs. 3-5).
  • a "signature" can represent
  • GPS Global Positioning System
  • geocoding the radiation data, such as data indicative of the signal strength of backscattered radiation, with other sensor data (data acquired by a video camera, data generated by a LIDAR profiler, etc.) can involve several parameters.
  • a first set of three parameters can be the xyz coordinates obtained from the GPS, which can locate the radar within a coordinate space.
  • Another set of three parameters can comprise orientation parameters from the inertial reference system, which can provide the orientation of the radar within the coordinate space (see, e.g., FIG. 2).
  • the orientation parameters can comprise a set of three-dimensional orientation angles describing the look angle of the radar system (see, e.g., FIG. 2).
  • Yet another parameter of the seven parameters can be a time stamp (e.g., a time record) obtained, for example, from the GPS.
  • the time stamp can be applied to all other data sets, including but not limited to optical imaging data (e.g., video imagery and LIDAR profiler data), thereby allowing the integration of the complete pavement imaging solution.
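A hedged sketch of the time-stamp integration described above: each radar detection can be matched to the optical frame with the nearest time stamp, so both data sets can be flagged together. The function name and frame rate are assumptions; only the standard library is used.

```python
# Nearest-time-stamp alignment of a radar detection to an optical frame.
import bisect

def nearest_frame(frame_times, t):
    """Index of the frame whose time stamp is closest to t.
    frame_times must be sorted in ascending order."""
    i = bisect.bisect_left(frame_times, t)
    if i == 0:
        return 0
    if i == len(frame_times):
        return len(frame_times) - 1
    # choose whichever neighbor is closer in time
    return i if frame_times[i] - t < t - frame_times[i - 1] else i - 1

frame_times = [0.00, 0.04, 0.08, 0.12]       # e.g., 25 fps video time stamps
radar_detection_time = 0.05
print(nearest_frame(frame_times, radar_detection_time))  # → 1 (frame at 0.04)
```

In practice the same shared time base (e.g., GPS time) would index the LIDAR profiler records as well.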
  • the system 102 can collect a substantial amount of data. For example, amounts of data on the order of a few gigabytes per minute can be common for the radiation detection device 112. In one aspect, such data can be rich and complex, particularly, yet not exclusively, because a portion of the data can be associated with roughness of a probed surface (e.g., pavement) that does not present cracking or another morphological anomaly or feature that is intended to be detected. In certain implementations, one or more methods can be implemented to analyze the collected data, identifying and/or removing contributions to the data arising from such roughness (referred to as "roughness data") and yielding information associated directly with cracking. Radiation data collected in the real aperture configuration can be utilized to detect surface roughness. Such real-aperture-based radiation detection can provide a very useful sensor in an integrated pavement management system, reducing manual inspection time and other human intervention, since the cracks can be pre-detected automatically.
  • an exemplary radiation detection device 112 can comprise a RADAR device weighing 3.5 pounds and with an antenna seven inches long. Such RADAR can consume several Watts to operate. Other RADAR units having less weight than 3.5 pounds also can be utilized.
  • RADAR units embodied on one or more integrated circuits that include antenna(s) can be utilized.
  • a radiation detection device 112 can be mounted in the bed of a truck on a wooden scaffold about five feet above the ground.
  • the antenna can be pointed at a 45 degree angle towards the pavement to create an oblique view from the back of the vehicle.
  • the radiation detection device 112 can comprise a RADAR unit manufactured by ImSAR or Artemis.
  • the optical detection device can comprise a laser profiler (or LIDAR profiler) pointed at the solid surface (e.g., pavement) that can permit characterizing the morphology of the solid surface.
  • the location device 106 can comprise a GPS for xyz coordinate geolocation augmented with an inertial reference system (IRS) that can provide xyz coordinate information when a GPS fix is unavailable.
  • IRS inertial reference system
  • Radiation data obtained with the radiation detection device 112, optical imaging data obtained from the optical detection device 108, location data obtained from the location device 106, and other data can be output to a user device, e.g., a device of a pavement engineer, for analysis of imagery associated with the data.
  • the radiation detection device 112 can be pointed ahead, behind, or to the side of the transportation device 104 so as to form an angle (e.g., look angle 126) allowing the electromagnetic energy to concentrate in the solid surface and morphological features thereof, such as pavement cracks, and be reflected back (or back scattered) to an antenna in the radiation detection device 112.
  • the optical detection device 108 can comprise a video camera pointed down towards the surface (e.g., the pavement) and a LIDAR profiler.
  • the location device 106 can comprise a GPS and IRS device. The location device 106 can provide the data to geo-locate all or a portion of the collected imaging data, and to synchronize, using time and coordinate stamps, for example, one or more data sets having such data.
  • the radiation detection device 112 can be mounted on the front or back of the transportation device 104, oriented at an angle (e.g., look angle 126) down towards the solid surface (e.g., pavement).
  • the radiation detection device 112 can emit electromagnetic pulses, for example, that are scattered and reflected back to the radiation detection device 112. Because of the way the electromagnetic pulses are concentrated and reflected back to the radiation detection device 112, such stronger returns can be indicative of the location of a specific morphology feature (e.g., a crack) of the solid surface.
  • Information associated with signal strength of the returned radar pulses can form a fingerprint indicative of roughness of the solid surface (e.g., pavement).
  • the system 102 can geo-locate the crack location and surface roughness with the other data being collected, which can comprise the video imagery and LIDAR profile data.
  • the radiation detection device 112 can provide a triggering mechanism that indicates the presence of a crack or anomaly in the pavement, flagging the integrated GPS/IRS, video imagery, and LIDAR profile data for follow up investigation by the pavement engineer.
  • One exemplary operation of the disclosure can comprise the radiation detection device 112 performing a first inspection of the solid surface (e.g., a road) and flagging video imagery obtained with the optical detection device 108 (e.g., with the video camera and LIDAR-based imaging data).
  • Such flagging step, or identification step can simplify inspection and interpretation of conditions of the surface, such as pavement, by a field engineer (e.g., a pavement engineer).
  • a conventional survey vehicle suitable (e.g., designed) for pavement imaging and profiling can be configured or fitted with the radiation detection device 112 by adding, for example, the radiation detection device 112, in either a forward or backward looking direction, in a manner that at least a portion of the solid surface (e.g., a lane of pavement) in which the survey vehicle moves also is being imaged with the radiation detection device 112 and other components of the disclosure.
  • FIG. 3 is a plot illustrating exemplary radiation data.
  • FIG. 3 is a waterfall plot showing the received signal from a radiation detection unit (in this case, a RADAR) with distance traveled on the y axis and time on the x axis.
  • the plot illustrates radiation data collected as the transportation device moves past two well-defined metallic features.
  • the background color (dark gray) represents random noise of the pavement and the speckled pattern is caused by the roughness of the pavement.
  • the lighter tones of grey indicate returned signals: one for each of the metal features.
  • Such a graph can be described as a "waterfall" plot because the plot tracks the signal through time and over distance. Return signals of the strength represented by the metal features can be detected with automated signal processing software.
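Automated detection on a waterfall plot can be sketched as a threshold scan over a distance-by-time grid. The array and threshold below are synthetic stand-ins for real RADAR data.

```python
# Rows: along-track distance; columns: time (range). Synthetic values.
waterfall = [
    [0.1, 0.2, 0.1, 0.1],
    [0.1, 0.9, 0.2, 0.1],   # strong return: first metal feature
    [0.2, 0.1, 0.1, 0.2],
    [0.1, 0.1, 0.8, 0.1],   # strong return: second metal feature
]

def detect_cells(grid, threshold=0.5):
    """Return (row, col) indices of cells exceeding the threshold."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, v in enumerate(row)
            if v > threshold]

print(detect_cells(waterfall))
```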
  • methods for analyzing the radiation data can utilize a matched filter approach to extract the signal from the clutter (or noise).
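A minimal matched-filter sketch, assuming a known pulse template: cross-correlating the received signal with the template concentrates the echo energy into a peak that stands out from clutter. The template and signal values are illustrative, not measured data.

```python
def matched_filter(signal, template):
    """Cross-correlate signal with template (valid region only).
    The score peaks where the template best aligns with an echo."""
    n, m = len(signal), len(template)
    return [sum(signal[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

template = [1.0, 2.0, 1.0]                     # assumed transmitted pulse shape
signal = [0.1, 0.0, 1.0, 2.1, 0.9, 0.1, 0.0]   # noisy echo starting near index 2
scores = matched_filter(signal, template)
peak = max(range(len(scores)), key=scores.__getitem__)
print(peak)  # index where the echo best matches the template
```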
  • These return signals detected in the radiation data can be geo-referenced, or geocoded, by linking coordinate information from the global positioning system and a backup inertial navigation system.
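Geocoding a detection by linking coordinate information can be sketched as a nearest-timestamp lookup into a sorted GPS log. The field layout and fixes below are illustrative; a real system would also blend in the inertial navigation data as the text notes.

```python
import bisect

gps_log = [  # (time_s, lat, lon) -- synthetic fixes in time order
    (0.0, 64.8400, -147.7200),
    (1.0, 64.8401, -147.7198),
    (2.0, 64.8402, -147.7196),
]
times = [t for t, _, _ in gps_log]

def geocode(detection_time):
    """Return the GPS fix nearest in time to the detection."""
    i = bisect.bisect_left(times, detection_time)
    candidates = gps_log[max(0, i - 1):i + 1]  # bracketing fixes
    return min(candidates, key=lambda fix: abs(fix[0] - detection_time))

print(geocode(1.2))
```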
  • cracks and other features can be geo-referenced, or geocoded.
  • a concrete sidewalk with well-defined joints can be probed or monitored.
  • Driving the transport vehicle over the sidewalk can provide radiation data indicating clear return signals as features in a waterfall plot.
  • FIG. 4 is another plot illustrating radiation data of an example concrete sidewalk.
  • FIG. 4A is an image of an example concrete sidewalk.
  • FIG. 4B is a waterfall plot illustrating cracks in the concrete sidewalk.
  • Concrete sidewalk cracks can be clearly observed in the plot as a succession of lines at the lower range scale (e.g., less than 30). Collected data and associated analysis demonstrate that real aperture based detection can be accomplished at close range.
  • detection of the joints and cracking on concrete roadways can occur for features perpendicular to the line of sight along which the radar antenna is oriented.
  • alternative antennae configuration(s) and/or orientation(s) can detect cracking in other perpendicular directions.
  • switched-beam electronically scanned arrays on monolithic microwave integrated circuits can be or can comprise a radiation detection device configured to operate with antenna sweeping.
  • two or more radiation detection devices can emit radiation in orthogonal directions to detect features oriented in different directions.
  • FIGs. 5A and 5B are plots illustrating radiation data of exemplary asphalt roads.
  • the results of the asphalt imaging test can be best characterized by comparing two scenes: one for a pristine asphalt road with no discernible cracks as shown in FIG. 5A, and one for a road with a maze of "spider" cracks as shown in FIG. 5B.
  • cracking detected in radiation data for asphalt can be more subtle. In one aspect, cracks of about 2 centimeters and larger can be identified.
  • the substantial volume of data from the RADAR sensor can be processed according to methodology(ies) described herein.
  • Such techniques can be suitably modified so that the radiation signal returns can be classified and geocoded to show the crack information in real time with the other data sets from the imagery and LIDAR sensors.
  • Such an approach can utilize statistical filters and artificial neural networks to classify features based on the morphology and importance of the feature for later examination by pavement engineers.
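One plausible statistical filter for this classification step is a cell-averaging outlier test, in the spirit of CFAR (constant false alarm rate) detection used in radar: compare each sample against the mean and spread of its neighbors and keep only strong outliers as candidate crack returns. This is a sketch under assumed window and threshold parameters, not the patent's specified classifier.

```python
import statistics

def outlier_indices(samples, window=3, k=2.0):
    """Flag samples more than k sigma above their local background,
    where the background is the mean of up to `window` neighbors on
    each side (the sample under test is excluded)."""
    hits = []
    for i, v in enumerate(samples):
        background = samples[max(0, i - window):i] + samples[i + 1:i + 1 + window]
        mu = statistics.mean(background)
        sigma = statistics.pstdev(background) or 1e-9  # avoid divide-by-zero
        if v > mu + k * sigma:
            hits.append(i)
    return hits

clutter = [0.2, 0.3, 0.25, 0.2, 1.5, 0.3, 0.2, 0.25]  # spike at index 4
print(outlier_indices(clutter))
```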
  • the radiation detection device can automatically detect the presence and characteristics of the pavement cracking, and then integrate this information with data from other sensors such as camera and laser pavement imaging/profiling systems.
  • data can comprise data and/or metadata indicative of images requiring inspection based on the detected cracks.
  • the data and/or metadata can indicate (e.g., flag) such images.
  • FIGs. 6A and 6B illustrate radiation data collected from an aircraft.
  • FIG. 6A is an exemplary image of a surface received by an optical imaging device.
  • FIG. 6B shows exemplary radiation data overlaid on the image.
  • the exemplary radiation data can be received from a synthetic aperture radar (SAR).
  • a system configured on a small aircraft, such as a Cessna 172, can be employed to collect data. Theoretically, the operation of the SAR from an aircraft can yield the same or similar imaging results as from a truck. In addition to waterfall plots, richer high-resolution images can be produced from the data collected by the aircraft.
  • SAR can easily see sidewalk cracks in concrete oriented orthogonal to the antenna orientation.
  • the SAR image was acquired by an aircraft flying along a direction parallel to the detected sidewalk cracks.
  • cracks that run parallel to the flight line can be easily observed, whereas those sidewalk cracks running perpendicular to the flight direction may be more difficult to detect.
  • cracking was detected as return signals when passing over concrete cracks.
  • Such approach includes data processing method(s) for removing signal noise that can mask fingerprint(s) of cracks.
  • real aperture based detection can yield a signal that can be traced through time and distance on a waterfall plot.
  • SAR can generate a real image that appears similar to an air photograph. Such forms of detection can be geo-referenced to permit integration of other pavement imaging technologies with real aperture and synthetic aperture technologies.
  • the image can indicate cracks or seams in the surface.
  • the regular pattern of white lines can illustrate seams 602 or other lines such as cracks detected on the sidewalk.
  • One or more methods for analyzing imaging data obtained according to aspects of the disclosure can manage the substantial size of the datasets, decimate the data, filter/reduce/eliminate noise from the signal, classify signals as false positives and true positives, extract the true positives, and then attempt to classify the true-positive signals.
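The processing chain named above (decimate, de-noise, threshold, screen false positives) might be sketched as follows; every function name, parameter, and value is an illustrative assumption rather than the disclosed implementation.

```python
import statistics

def decimate(samples, factor=2):
    """Reduce data volume by keeping every factor-th sample."""
    return samples[::factor]

def median_filter(samples, window=3):
    """Suppress speckle-like noise with a sliding median."""
    half = window // 2
    return [statistics.median(samples[max(0, i - half):i + half + 1])
            for i in range(len(samples))]

def candidates(samples, threshold=0.5):
    """Indices whose strength exceeds the detection threshold."""
    return [i for i, v in enumerate(samples) if v > threshold]

def screen(indices, min_run=2):
    """Treat isolated blips as false positives: keep only indices
    belonging to consecutive runs of at least min_run."""
    keep, run = [], []
    for i in indices:
        if run and i == run[-1] + 1:
            run.append(i)
        else:
            if len(run) >= min_run:
                keep.extend(run)
            run = [i]
    if len(run) >= min_run:
        keep.extend(run)
    return keep

raw = [0.1, 0.9, 0.2, 0.1, 0.8, 0.7, 0.9, 0.8, 0.1, 0.2]  # synthetic stream
smooth = median_filter(decimate(raw))
print(screen(candidates(smooth)))
```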
  • Airborne experiments of the disclosure demonstrate that pavement inspection can be performed from transportation devices other than terrestrial vehicles.
  • a miniaturized SAR or other radiation detection device can be operated from unmanned aerial systems flying at lower altitudes. It should be appreciated that airborne SAR can be performed on platforms in space (e.g., satellites) and high-flying jets, for example.
  • the close-range SAR of the disclosure, and related embodiments of systems and methods, can synthesize the aperture at very short distances and low altitudes, with the platform still traveling enough distance to synthesize the aperture.
  • exemplary methods disclosed throughout the subject specification can be stored on an article of manufacture, or computer-readable medium, to facilitate transporting and transferring such methods to a computing device (e.g., a desktop computer, a mobile computer, a mobile telephone, a blade computer, a programmable logic controller, and the like) for execution.
  • FIG. 7 is a flowchart illustrating an exemplary method for feature detection.
  • radiation can be provided to the surface.
  • an antenna can emit electromagnetic radiation.
  • at least a portion of the radiation can yield the radiation backscattered from a surface in response to interacting with the surface.
  • the surface can comprise a surface of at least one of a road, a parking lot, a driveway, a rooftop, an aircraft runway, a bridge, a sidewalk, and the like.
  • radiation data can be received based on radiation backscattered from a surface.
  • a feature of the surface can be identified in the radiation data.
  • the feature can comprise at least one of a crack, a hole, an indentation in the surface, and the like.
  • a signal strength value above a threshold value can be identified from the radiation data.
  • the signal strength value can indicate the amount of radiation backscattered from one or more locations on the surface.
  • the feature can reflect a greater amount of radiation back to a radiation detection device than areas of the surface surrounding the feature.
  • a pattern of the signal strength values over time can be identified. An example pattern can comprise a series of signal strengths above a threshold value.
  • the threshold value can comprise a specified signal strength.
  • optical imaging data associated with the surface can be received.
  • data from a light detection and ranging device can be received.
  • data from the light detection and ranging device can comprise distance of the feature from the light detection and ranging device and/or spatial characteristics of the feature.
  • the spatial characteristics can comprise height, length, depth, shape, edge information, and the like.
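As a hedged sketch of deriving spatial characteristics from a LIDAR height profile, the code below estimates the depth and width of a dip (e.g., a crack) in a cross-section of heights. The 2 cm sample spacing, the tolerance, and the profile values are assumed for illustration.

```python
SPACING_M = 0.02  # assumed along-profile sample spacing in meters

def feature_depth_width(profile, baseline=0.0, tol=0.005):
    """Depth = deepest dip below the baseline; width = extent of the
    contiguous span of samples noticeably below the baseline."""
    below = [i for i, h in enumerate(profile) if h < baseline - tol]
    if not below:
        return 0.0, 0.0
    depth = baseline - min(profile)
    width = (below[-1] - below[0] + 1) * SPACING_M
    return depth, width

profile = [0.0, 0.0, -0.01, -0.03, -0.02, 0.0, 0.0]  # heights (m): a small crack
print(feature_depth_width(profile))
```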
  • a portion of the optical imaging data comprising the feature can be identified.
  • a geographic location can be associated with the feature.
  • geographic data can be received from at least one of a geographic positioning system device and an inertial reference system device.
  • the geographic data can be associated with a time and/or with other data, such as radiation data and optical imaging data.
  • a representation of the feature based on the optical imaging data and the associated geographic location can be provided.
  • the representation of the feature can comprise a three dimensional computer generated model.
  • the three dimensional model can be based on, for example, data from a LIDAR device and/or data from a radiation detection device.
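A three dimensional representation might be assembled by stacking successive cross-track LIDAR profiles at each along-track position to form a point cloud. The spacings and heights below are synthetic, and the function name is hypothetical.

```python
def profiles_to_points(profiles, dx=0.1, dy=0.02):
    """profiles[i][j] = height z at along-track index i, cross-track
    index j; return (x, y, z) points using assumed spacings dx, dy."""
    return [(i * dx, j * dy, z)
            for i, profile in enumerate(profiles)
            for j, z in enumerate(profile)]

profiles = [[0.0, -0.01, 0.0],
            [0.0, -0.03, 0.0]]   # a crack deepening along-track
points = profiles_to_points(profiles)
print(len(points), points[4])
```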
  • the representation can comprise a two dimensional image, such as a photograph of the feature.
  • FIG. 8 illustrates a flowchart of an exemplary method 800 for monitoring a solid surface via close-range SAR in accordance with one or more aspects of the disclosure.
  • the exemplary system 102 described in connection with FIG. 1 can carry out the exemplary method 800.
  • backscattered radiation from a solid surface can be collected at a radiation detection device (e.g., RADAR).
  • Block 802 can be referred to as the collecting backscattered radiation step and can comprise delivering radiation onto the surface, at least a portion of the radiation yielding the backscattered radiation in response to interacting with the surface.
  • the collecting backscattered radiation step can comprise generating a time series of data records associated with signal strength of the backscattered radiation.
  • At block 804 at least one feature associated with morphology of the solid surface can be extracted, by a computing device (e.g., computing device 901) from data yielded from the collected backscattered radiation.
  • Block 804 can be referred to as an extracting step.
  • the extracting step can comprise analyzing the time series (e.g., the radiation data over time, example illustrations of such are found in the waterfall plots of FIGs. 3-5) and, in response, identifying a signal level above a threshold.
  • optical imaging data associated with the solid surface can be collected at an optical detection device (e.g., a video camera, a CCD camera, a LIDAR unit, etc.).
  • the at least one feature can be associated, by the computing device, with at least a portion of the optical imaging data associated with the solid surface.
  • the at least one feature can be geocoded with location data, by the computing device. Block 810 can be referred to as the geocoding step.
  • the geocoding step can comprise associating the at least one feature with the location data, the location data comprising one or more of global positioning system based data or inertial reference system based data.
  • the exemplary method 800 also can comprise associating, by the computing device, the at least one feature with at least a portion of the optical imaging data associated with the solid surface.
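The blocks of method 800 can be tied together in a short sketch: collect a time series of backscatter strengths (block 802), extract above-threshold features (block 804), and geocode each with the nearest location fix (block 810). All names, thresholds, and data are illustrative assumptions, not the disclosed system.

```python
def run_method_800(radiation, fixes, threshold=0.5):
    """radiation: list of (time, strength); fixes: {time: (lat, lon)}.
    Returns a report of geocoded above-threshold detections."""
    report = []
    for t, strength in radiation:            # block 802: collect time series
        if strength > threshold:             # block 804: extract feature
            nearest = min(fixes, key=lambda ft: abs(ft - t))
            report.append({"time": t,        # block 810: geocode feature
                           "strength": strength,
                           "location": fixes[nearest]})
    return report

radiation = [(0.0, 0.1), (1.0, 0.9), (2.0, 0.2)]  # synthetic backscatter
fixes = {0.0: (64.84, -147.72), 1.0: (64.85, -147.71), 2.0: (64.86, -147.70)}
print(run_method_800(radiation, fixes))
```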
  • the disclosure can be advantageous over conventional technologies to various service sectors associated with transportation.
  • pavement engineers, pavement imaging companies, city and county engineers, and state and federal departments of transportation can benefit from one or more of the aspects or features of the disclosure by achieving efficiencies not available through conventional technologies.
  • the disclosure also can have other economic implications, such as better utilization of taxes resulting from reduced costs of pavement inspection and better pavement management strategies.
  • Private sector and public sector organizations that utilize roads can benefit from the invention by being able to implement more adequate or efficient logistics with better pavement management systems.
  • aspects of the disclosure contemplate use of electromagnetic waves at substantively close ranges - closer than theoretical "range gates" that suggest the minimum distance the sensor can be from the feature being imaged.
  • conventional SAR solutions can be mounted on satellites and aircraft.
  • the close-range SAR can accomplish radar imaging while mounted on a terrestrial vehicle, such as a trailer towed behind a truck.
  • the disclosed radiation detection attained in a system mounted on a terrestrial vehicle can comprise mapping of radiation data to optical imaging data and/or time stamps.
  • FIG. 9 illustrates a block diagram of an exemplary operating environment 900 having a computing device 901 that enables various features of the disclosure and performance of the various methods disclosed herein.
  • Computing device 901 can embody a data processing unit that can be part of the radiation detection device or can be functionally coupled thereto.
  • This exemplary operating environment 900 is only an example of an operating environment and is not intended to suggest any limitation as to the scope of use or functionality of operating environment architecture. Neither should the exemplary operating environment 900 be interpreted as having any dependency or requirement relating to any one or combination of functional elements (e.g., units, components, adapters, or the like) illustrated in such exemplary operating environment.
  • the various embodiments of the disclosure can be operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations that can be suitable for use with the systems and methods of the disclosure comprise personal computers, server computers, laptop devices or handheld devices, and multiprocessor systems. Additional examples comprise mobile devices, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that comprise any of the above systems or devices, and the like.
  • the processing effected in the disclosed systems and methods can be performed by software components.
  • the disclosed systems and methods can be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers, such as computing device 901, or other computing devices.
  • program modules comprise computer code, routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the disclosed methods also can be practiced in grid-based and distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules can be located in both local and remote computer storage media including memory storage devices.
  • the systems and methods disclosed herein can be implemented via a general-purpose computing device in the form of a computing device 901.
  • the components of the computer 901 can comprise, but are not limited to, one or more processors 903, or processing units 903, a system memory 912, and a system bus 913 that couples various system components including the processor 903 to the system memory 912. In the case of multiple processing units 903, the system can utilize parallel computing.
  • a processor 903 or a processing unit 903 refers to any computing processing unit or processing device comprising, but not limited to, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory.
  • a processor 903 or processing unit 903 can refer to an integrated circuit, an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic controller (PLC), a complex programmable logic device (CPLD), a discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • Processors or processing units referred to herein can exploit nano-scale architectures such as, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of the computing devices that can implement the various aspects of the subject disclosure.
  • Processor 903 or processing unit 903 also can be implemented as a combination of computing processing units.
  • the system bus 913 represents one or more of several possible types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
  • bus architectures can comprise an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, an Accelerated Graphics Port (AGP) bus, a Peripheral Component Interconnects (PCI) bus, a PCI-Express bus, a Personal Computer Memory Card International Association (PCMCIA) bus, a Universal Serial Bus (USB), and the like.
  • the bus 913, and all buses specified in this specification and annexed drawings also can be implemented over a wired or wireless network connection and each of the subsystems, including the processor 903, a mass storage device 904, an operating system 905, feature detection software 906, feature detection data 907, a network adapter 908, system memory 912, an Input/Output Interface 910, a display adapter 909, a display device 911, and a human machine interface 902, can be contained within one or more remote computing devices 914a,b,c at physically separate locations, functionally coupled (e.g., communicatively coupled) through buses of this form, in effect implementing a fully distributed system.
  • Feature detection software 906 can configure the computing device 901, or a processor thereof, to perform one or more of imaging data collection, analysis of at least a portion of such data for feature extraction associated with morphological features of a probed solid surface, and geocoding of at least a portion of the imaging data and location data in accordance with aspects of the disclosure.
  • Feature detection software 906 can be retained in a memory as a group of computer-accessible instructions, e.g., computer- readable instructions, computer-executable instructions, or computer-readable computer- executable instructions.
  • the group of computer-accessible instructions can encode one or more methods of the disclosure.
  • the group of computer- accessible instructions can encode various formalisms for feature extraction, such as wavelet analysis, autonomous classification, or the like.
  • Certain implementations of feature detection software 906 can include a compiled instance of such computer-accessible instructions, a linked instance of such computer-accessible instructions, a compiled and linked instance of such computer-executable instructions, or an otherwise executable instance of the group of computer-accessible instructions.
  • Feature detection data 907 can comprise various types of data that can permit implementation (e.g., compilation, linking, execution, and combinations thereof) of the feature detection software 906.
  • feature detection data 907 can comprise imaging data described herein, such as data available for waterfall plots, and data structures containing information associated with imaging data, location data, and geocoding.
  • the computing device 901 typically comprises a variety of computer readable media. Exemplary readable media can be any available media that is accessible by the computer 901 and comprises, for example and not meant to be limiting, both volatile and non-volatile media, removable and non-removable media.
  • the system memory 912 comprises computer readable media in the form of volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read only memory (ROM).
  • the system memory 912 typically contains data (such as a group of tokens employed for code buffers) and/or program modules such as operating system 905 and feature detection software 906 that are immediately accessible to and/or are presently operated on by the processing unit 903.
  • Operating system 905 can comprise OSs such as Windows operating system, Unix, Linux, Symbian, Android, iOS, Chromium, and substantially any operating system for wireless computing devices or tethered computing devices.
  • computing device 901 can comprise other removable/nonremovable, volatile/non-volatile computer storage media.
  • computing device 901 comprises a mass storage device 904 which can provide non-volatile storage of computer code (e.g., computer-executable instructions), computer-readable instructions, data structures, program modules, and other data for the computing device 901.
  • a mass storage device 904 can be a hard disk, a removable magnetic disk, a removable optical disk, magnetic cassettes or other magnetic storage devices, flash memory cards, CD-ROM, digital versatile disks (DVD) or other optical storage, random access memories (RAM), read only memories (ROM), electrically erasable programmable read-only memory (EEPROM), and the like.
  • any number of program modules can be stored on the mass storage device 904, including by way of example, an operating system 905, and feature detection software 906.
  • Each of the operating system 905 and feature detection software 906 (or some combination thereof) can comprise elements of the programming and the feature detection software 906.
  • Data and code e.g., computer-executable instruction(s)
  • Feature detection software 906, and related data and code, can be stored in any of one or more databases known in the art. Examples of such databases comprise DB2®, Microsoft® Access, Microsoft® SQL Server, Oracle®, mySQL, PostgreSQL, and the like. Further examples include membase databases and flat file databases. The databases can be centralized or distributed across multiple systems.
  • a user can enter commands and information into the computing device 901 via an input device (not shown).
  • input devices comprise, but are not limited to, a camera; a keyboard; a pointing device (e.g., a "mouse"); a microphone; a joystick; a scanner (e.g., a barcode scanner); a reader device such as a radiofrequency identification (RFID) reader or a magnetic stripe reader; gesture-based input devices such as tactile input devices (e.g., touch screens, gloves and other body coverings or wearable devices), speech recognition devices, or natural interfaces; and the like.
  • These and other input devices can be connected to the processing unit 903 via a human machine interface 902 that is coupled to the system bus 913, but can be connected by other interface and bus structures, such as a parallel port, game port, an IEEE 1394 Port (also known as a Firewire port), a serial port, or a universal serial bus (USB).
  • a display device 911 also can be functionally coupled to the system bus 913 via an interface, such as a display adapter 909. It is contemplated that the computer 901 can have more than one display adapter 909 and the computer 901 can have more than one display device 911.
  • a display device can be a monitor, an LCD (Liquid Crystal Display), or a projector.
  • other output peripheral devices can comprise components such as speakers (not shown) and a printer (not shown) which can be connected to the computer 901 via Input/Output Interface 910. Any step and/or result of the methods can be output in any form to an output device.
  • Such output can be any form of visual representation, including, but not limited to, textual, graphical, animation, audio, tactile, and the like.
  • one or more sensor(s) 918 can be functionally coupled to the system bus 913 through an I/O interface of the one or more I/O interface(s) 910.
  • the sensor(s) can comprise radiation detection devices (e.g., RADAR) and/or optical detection devices (e.g., LIDAR, camera system).
  • the one or more camera(s) can be functionally coupled to other functional elements of the computing device.
  • the I/O interface, at least a portion of the system bus 913, and system memory 912 can embody a data collection unit that can permit receiving data acquired by at least one of the one or more sensor(s) 918.
  • Such data collection unit can be an analog unit or a unit for collection of digital data, or a combination thereof.
  • processor 903 can provide analog-to-digital functionality and decoder functionality, and the I/O interface can include circuitry to collect the analog signal received from at least one sensor of the one or more sensor(s) 918.
  • the computing device 901 can operate in a networked environment (e.g., an industrial environment) using logical connections to one or more remote computing devices 914a,b,c, and equipment 916.
  • a remote computing device can be a personal computer, portable computer, a mobile telephone, a server, a router, a network computer, a peer device or other common network node, and so on.
  • Logical connections between the computer 901 and a remote computing device 914a,b,c can be made via a local area network (LAN) and a general wide area network (WAN).
  • a network adapter 908 can be implemented in both wired and wireless environments.
  • networking environments can be conventional and commonplace in offices, enterprise-wide computer networks, and intranets.
  • the networking environments generally can be embodied in wireline networks or wireless networks (e.g., cellular networks, such as Third Generation (3G) and Fourth Generation (4G) cellular networks, and facility-based networks (femtocell, picocell, Wi-Fi networks, etc.)).
  • a group of one or more network(s) 915 can provide such networking environments.
  • the remote computing devices 914a,b,c can embody additional sensor(s), such as inertial guidance system(s).
  • equipment 916 can comprise various parts of the exemplary system illustrated in FIG. 1.
  • equipment 916 also can comprise an inertial guidance system.
  • Computer-readable media can comprise “computer storage media,” or “computer-readable storage media,” and “communications media.”
  • “Computer storage media” comprise volatile and non-volatile, removable and non-removable media implemented in any methods or technology for storage of information such as computer readable instructions, data structures, program modules, or other data.
  • Exemplary computer storage media comprises, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
  • the systems and methods of the subject disclosure can employ artificial intelligence (AI) techniques such as machine learning and iterative learning.
  • techniques include, but are not limited to, expert systems, case based reasoning, Bayesian networks, behavior based AI, neural networks, fuzzy systems, evolutionary computation (e.g., genetic algorithms), swarm intelligence (e.g., ant algorithms), and hybrid intelligent systems (e.g., expert inference rules generated through a neural network or production rules from statistical learning).

Abstract

Provided are systems and methods for feature detection. Exemplary methods can comprise receiving radiation data based on radiation backscattered from a surface. Exemplary methods can also comprise identifying, in the radiation data, a feature of the surface. Exemplary methods can further comprise receiving optical imaging data associated with the surface. Exemplary methods can also comprise identifying a portion of the optical imaging data comprising the feature. Exemplary methods can comprise associating a geographic location with the feature.

Description

SURFACE FEATURE DETECTION BY RADIATION ANALYSIS
CROSS REFERENCE TO RELATED PATENT APPLICATION
[0001] This application claims priority to U.S. Provisional Application No. 61/589,247 filed January 20, 2012, herein incorporated by reference in its entirety.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH
[0002] This invention was made with government support under Cooperative Agreement No. RITARS-11-H-UAF awarded by the Department of Transportation, Research and Innovative Technology Administration. The government has certain rights in the invention.
SUMMARY
[0003] It is to be understood that both the following general description and the following detailed description are exemplary and explanatory only and are not restrictive, as claimed. Provided are methods and systems for feature detection. In one aspect of exemplary methods, radiation data based on radiation backscattered from a surface can be received. A feature of the surface can be identified in the radiation data. Optical imaging data associated with the surface can be received. A portion of the optical imaging data comprising the feature can be identified. Additionally, a geographic location can be associated with the feature.
[0004] In one aspect, exemplary systems can comprise a radiation detection device configured to receive radiation backscattered from a surface. Exemplary systems can also comprise an optical detection device configured to receive optical imaging data associated with the surface. Exemplary systems can further comprise a computing device configured to: receive radiation data from the radiation detection device, and the radiation data can be based on the radiation backscattered from the surface; identify in the radiation data a feature of the surface; receive the optical imaging data associated with the surface from the optical detection device; identify a portion of the optical imaging data comprising the feature; and associate a geographic location with the feature.
[0005] Additional advantages will be set forth in part in the description which follows or may be learned by practice. The advantages will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments and together with the description, serve to explain the principles of the methods and systems:
Figures 1A and 1B are illustrations of an exemplary system for feature detection;
Figure 2 is a diagram illustrating an exemplary system for feature detection;
Figure 3 is a plot illustrating exemplary radiation data;
Figure 4 is a plot illustrating radiation data of an example concrete sidewalk;
Figures 5A and 5B are plots illustrating radiation data of exemplary asphalt roads;
Figures 6A and 6B illustrate radiation data collected from an aircraft;
Figure 7 is a flowchart illustrating an exemplary method for feature detection;
Figure 8 illustrates a flowchart of an exemplary method for monitoring a solid surface via close-range SAR in accordance with one or more aspects of the disclosure; and
Figure 9 illustrates a block diagram of an exemplary operating environment having a computing device that enables various features of the disclosure and performance of the various methods disclosed herein.
DETAILED DESCRIPTION
[0007] Before the present articles, devices, and/or methods are disclosed and described, it is to be understood that the subject disclosure is not limited to the specific systems and methods for feature detection described herein. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
[0008] As used in the specification and the appended claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise.
[0009] Ranges may be expressed herein as from "about" one particular value, and/or to "about" another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent "about," it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint.
[0010] In the subject specification and in the claims which follow, reference may be made to a number of terms which shall be defined to have the following meanings:
"Optional" or "optionally" means that the subsequently described event or circumstance may or may not occur, and that the description includes instances where said event or circumstance occurs and instances where it does not.
[0011] As employed in this specification and annexed drawings, the terms "unit," "component," "interface," "system," "platform," and the like are intended to include a computer-related entity or an entity related to an operational apparatus with one or more specific functionalities, wherein the computer-related entity or the entity related to the operational apparatus can be either hardware, a combination of hardware and software, software, or software in execution. One or more of such entities are also referred to as "functional elements." As an example, a unit may be, but is not limited to being, a process running on a processor, a processor, an object, an executable computer program, a thread of execution, a program, a memory (e.g., a hard disc drive), and/or a computer. As another example, a unit can be an apparatus with specific functionality provided by mechanical parts operated by electric or electronic circuitry which is operated by a software or firmware application executed by a processor, wherein the processor can be internal or external to the apparatus and executes at least a part of the software or firmware application. In addition or in the alternative, a unit can provide specific functionality based on physical structure or specific arrangement of hardware elements. As yet another example, a unit can be an apparatus that provides specific functionality through electronic functional elements without mechanical parts; the electronic functional elements can include a processor therein to execute software or firmware that provides at least in part the functionality of the electronic functional elements. An illustration of such an apparatus can be control circuitry, such as a programmable logic controller. The foregoing examples and related illustrations are but a few examples and are not intended to be limiting. 
Moreover, while such illustrations are presented for a unit, the foregoing examples also apply to a component, a system, a platform, and the like. It is noted that in certain embodiments, or in connection with certain aspects or features thereof, the terms "unit," "component," "system," "interface," and "platform" can be utilized interchangeably.
[0012] Throughout the description and claims of this specification, the word "comprise" and variations of the word, such as "comprising" and "comprises," means "including but not limited to," and is not intended to exclude, for example, other additives, components, integers or steps. "Exemplary" means "an example of" and is not intended to convey an indication of a preferred or ideal embodiment. "Such as" is not used in a restrictive sense, but for explanatory purposes.
[0013] Reference will now be made in detail to the various embodiment(s), aspects, and features of the subject disclosure, example(s) of which are illustrated in the accompanying drawings. Aspects, features, or advantages of the subject disclosure will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the subject disclosure. The advantages of the subject disclosure will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims. It is to be understood that the description herein is exemplary and explanatory only and is not restrictive of the disclosure.
[0014] Conventional visual inspection of pavement requires a trained pavement engineer to visually inspect every frame of video imagery to detect and then analyze pavement cracking and surface conditions. As described in greater detail below, the subject disclosure relates to monitoring and inspection of surfaces, such as solid surfaces, with remote sensing technologies integrated with position sensors to geolocate (e.g., to associate a data record with a location record) the remotely sensed digital data sets. The remote sensing technologies can comprise one or more of radiation detection units, such as radio detection and ranging (RADAR), and optical detection devices such as laser profilers and optical cameras. The solid surfaces can comprise road pavement comprising asphalt, concrete, and the like. One or more digital data sets indicative of at least one image of a probed surface can be output to a computer device (e.g., user equipment) for analysis, such as maintenance planning and the like.
[0015] Radiation detection devices, such as RADAR devices, as employed in the disclosure can be utilized to image and characterize road and bridge pavement, for example, to assess and monitor pavement condition, cracking, and/or roughness. In one aspect, radiation imaging of a solid surface, as described herein, can provide suitable data for locating and characterizing the condition and degree of cracking of the solid surface (e.g., pavement). In another aspect, the disclosed radiation data detection can augment traditional camera imaging of solid surfaces, such as pavement. In yet another aspect, the disclosed radiation detection devices, when used in conjunction with video imaging techniques, for example, can permit automated detection of pavement cracking and conditions, thereby speeding the manual inspection and analysis of video imagery. In certain implementations, the disclosure can be utilized by pavement engineers, pavement imaging companies, city and county engineers, and state and federal departments of transportation.
[0016] Exemplary salient features or aspects of the disclosure can comprise the following: (1) radiation detection devices can collect data at night, during the day, and through fog; (2) radiation detection devices can automatically detect pavement joints and cracking orthogonal to the radar line of sight, serving as an effective tool to automatically augment data, such as features on the surfaces, collected from optical imaging systems such as cameras and light detection and ranging devices; (3) three-dimensional models of the feature and/or surface can be assembled from the radiation data; (4) temporal measurements are also enabled, particularly through interferometry, allowing automated change detection of the feature; (5) small-scale (or minute) pavement features, including pavement cracking, are readily discernible with radiation detection devices; and/or (6) surface roughness can be more easily quantified.
[0017] FIG. 1A is a side-view illustrating an exemplary system 102 for feature detection. Additionally, FIG. 1B is a top-down view illustrating the exemplary system 102 for feature detection. In one aspect, the system 102 can comprise a transport device 104 configured to move the system. For example, the transport device 104 can be a truck, van, car, and the like for traveling along a roadway 116. In another aspect, the transport device 104 can comprise an aircraft, as shown in FIG. 2. The system 102 can also comprise a location device 106 configured to receive location information. In one aspect, the location device 106 can comprise a global positioning system (GPS) device configured to receive geographic coordinates from one or more satellite devices. In another aspect, the location device 106 can comprise an inertial reference system configured to provide coordinates based on a previous location.
[0018] In one aspect, the system 102 can comprise an optical detection device 108 configured to receive optical imaging data associated with a surface 110. The optical detection device 108 can comprise an imaging device, such as a camera. Additionally, the optical detection device 108 can comprise a light detection and ranging device (LIDAR). The LIDAR device can be configured to detect distance of the feature from the LIDAR device and spatial characteristics of the feature. For example, the optical detection device 108 can comprise a laser emitter and laser detector. In another aspect, the system 102 can comprise a radiation detection device 112 configured to receive radiation backscattered from the surface. For example, the radiation detection device 112 can be configured to provide radiation to the surface. Additionally, at least a portion of the radiation can yield the radiation backscattered from the surface in response to interacting with the surface.
[0019] As shown in FIG. 1B, the surface can comprise one or more features 114. For example, the features can be morphological features, such as cracks, holes, indentations, and the like. The features can be located, for example, on a roadway 116, a sidewalk 118, a parking lot, a driveway, a rooftop, an aircraft runway, a bridge, and the like. The radiation detection device 112 can comprise a first positioning device 120. The first positioning device 120 can position a first radiation source 122 and first radiation sensor 124 at an angle 126 to the surface 110. For example, the first positioning device 120 can comprise one or more antennae configured to emit and receive electromagnetic waves.
[0020] In one aspect, the radiation detection device 112 can comprise a second positioning device 121 for a second radiation source 123 and a second radiation sensor 125. In one aspect, the first radiation source 122 can be oriented along the direction of travel 127, and the second radiation source 123 can be oriented perpendicular to the direction of travel 127. Accordingly, the first radiation source 122 can detect features oriented perpendicular to the direction of travel 127, and the second radiation source 123 can detect features oriented parallel to the direction of travel 127. Though not shown, the radiation detection device 112 can comprise additional radiation sources oriented in various directions. Additionally, in some configurations, the radiation detection device 112 can be limited to either the first radiation source 122 or the second radiation source 123.
[0021] The radiation sources 122 and 123 can emit electromagnetic radiation, such as electromagnetic waves. For example, the radiation sources 122 and 123 can emit radio waves (e.g., electromagnetic radiation with frequencies from 4 kHz to 300 GHz). The radiation sensors 124 and 125 can detect the radiation from the radiation sources 122 and 123 after the radiation is reflected (e.g., backscattered) back to the radiation sensors 124 and 125. In one aspect, the radiation detection device 112 can comprise one or more RADAR devices. For example, the radiation detection device 112 can comprise a miniaturized RADAR device. Furthermore, the radiation detection device 112 can comprise a signal processor configured to process signals received at the radiation sensors 124 and 125. Additionally, the radiation detection device 112 can comprise an input/output interface for communicating with other devices, such as the optical detection device 108, location device 106, a computing device 128, and/or the like. The radiation detection device 112 can comprise memory for storing the radiation data. Additionally, the radiation data can be provided to and stored at the computing device 128 or another location.
[0022] In one aspect, the system 102 can comprise a computing device 128
communicatively coupled to the radiation detection device 112, optical detection device 108, and/or location device 106. The computing device 128 can be configured to receive radiation data from the radiation detection device 112. For example, the radiation data can be based on the radiation backscattered from the surface 110. The computing device 128 can be configured to identify in the radiation data a feature 114 of the surface 110. For example, the computing device 128 can be configured to identify from the radiation data a signal strength value above a threshold value. The signal strength value can indicate the amount of radiation backscattered from one or more locations on the surface. Additionally, the computing device can be configured to identify a pattern of the signal strength values over time.
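The thresholding step described above can be sketched as follows. The function name, threshold value, and sample data are illustrative assumptions for this sketch, not values taken from the disclosure; a real system would calibrate the threshold against known surfaces.

```python
# Illustrative sketch: flag candidate surface features where the
# backscatter signal strength exceeds a threshold value.

def detect_features(signal_strengths, threshold):
    """Return indices of samples whose backscatter strength exceeds threshold."""
    return [i for i, s in enumerate(signal_strengths) if s > threshold]

# Simulated strip of backscatter samples: mostly smooth pavement (weak
# return), with two strong returns where cracks reflect energy back.
samples = [0.12, 0.10, 0.11, 0.85, 0.13, 0.09, 0.78, 0.11]
print(detect_features(samples, threshold=0.5))  # -> [3, 6]
```

A pattern of such above-threshold indices over time corresponds to the "smear" of a crack signature described later in the disclosure.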
[0023] The computing device 128 can further be configured to receive the optical imaging data associated with the surface 110 from the optical detection device 108. For example, the computing device 128 can be configured to receive the distance of the feature 114 from the light detection and ranging device and the spatial characteristics of the feature 114 from the light detection and ranging device. The computing device 128 can also be configured to identify a portion of the optical imaging data comprising the feature 114. In another aspect, the computing device 128 can be configured to associate a geographic location with the feature. For example, the computing device 128 can be configured to receive geographic data from the location device 106 (e.g., at least one of the geographic positioning system device and the inertial reference system device). The computing device 128 can be further configured to provide a representation of the feature based on the optical imaging data and the associated geographic location. In one aspect, the representation of the feature 114 comprises a three-dimensional computer-generated model.
[0024] FIG. 2 is a diagram illustrating an exemplary system 202 for feature detection. In one aspect, the system 202 can comprise a transport device 204. For example, the transport device 204 can comprise an aircraft. In one aspect, the aircraft can comprise an unmanned aerial system. In another aspect, the transport device 204 can comprise a ground vehicle, such as a car, truck, van, and the like. The transport device 204 can be coupled to measuring devices, such as the optical detection device 108, location device 106, and radiation detection device 112 described herein. One or more of the measuring devices can be oriented at a look angle 206. The look angle 206 can determine a portion 208 of a surface measured by the measuring devices. For example, the radiation detection device 112 can emit radiation (e.g., electromagnetic waves) at the look angle 206 and receive reflected radiation at the look angle 206. The height 210 (e.g., altitude) of the transport device 204 above the surface, the look angle 206, the velocity of the transport device 204, the geographic location of the transport device 204, and other measurements can be used to determine the geographic location of one or more features on the surface.
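As a sketch of the geometry just described, the offset of the illuminated surface portion from the point directly below the transport device follows from the altitude and look angle. This simplified flat-surface model, with the look angle measured from vertical (nadir), is an illustrative assumption, not a formula stated in the disclosure.

```python
import math

def ground_offset(altitude_m, look_angle_deg):
    """Horizontal distance from the nadir point to the illuminated
    surface portion, assuming a flat surface and a look angle
    measured from vertical (nadir)."""
    return altitude_m * math.tan(math.radians(look_angle_deg))

def slant_range(altitude_m, look_angle_deg):
    """Line-of-sight distance from the sensor to the surface portion."""
    return altitude_m / math.cos(math.radians(look_angle_deg))

# Aircraft at roughly 1,000 feet (about 305 m) with a 45-degree look angle:
print(round(ground_offset(305.0, 45.0), 1))  # -> 305.0
print(round(slant_range(305.0, 45.0), 1))    # -> 431.3
```

Combining this offset with the transport device's own GPS position and heading yields the geographic location of the illuminated portion.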
[0025] The systems 102 and 202 are described in further detail as follows. In one embodiment, the radiation detection device 112 can collect backscattered radiation off a surface, and computer-executable instructions can permit data processing and related feature extraction. In another aspect, software can be configured (e.g., programmed) to discriminate signal from noise and generate imaging data indicative of imagery of a monitored solid surface. The software can be configured (e.g., programmed or coded) according to one or more various methods of the disclosure. Such methods can be grouped into two broad categories: discrimination, and classification. The discrimination process can broadly encompass processing, filtering, analysis, and identification of features. These
preprocessing steps can occur in real-time or near-real-time. However, the raw radiation data, optical imaging data, and the geocoding information can be retained for later processing and analysis.
[0026] Classification algorithms can evaluate the radiation data for several broad categories of data, such as surface texture, surface cracking, and other features on the surface that are not intrinsically part of the surface. Surface texture can comprise roughness, grooving, and texture. Surface texture data, for example, can provide
information about how the surface of road pavement creates friction with the tire rubber in contact with the road for safety purposes. The surface structure can broadly comprise the joints found in pavement intentionally placed to permit the expansion and contraction of pavement surfaces as well as the undesirable cracking of pavement that occurs due to wear and tear of traffic and weather. Cracking can take on a multitude of forms because the damage can occur in many different ways. The third category of classification can comprise identifying features on the surface of pavement that are not part of the pavement structure, such as gravel, stray items that are lost from passing traffic, materials blown onto road surfaces from weather, and other unexpected items. Both the discrimination and classification methods can be based on mathematical and statistical filters derived from machine learning and expert rules.
[0027] In one implementation, the radiation data can augment data received from optical imaging devices, such as optical and laser sensors. One or more aspects of the disclosure can discriminate (e.g., output data that can permit identification of specific features) concrete pavement from asphalt pavement. In one aspect, the joints and cracks in pavement being identified can be smaller than the theoretical resolution of the radar imaging sensors. In one aspect, two types of theoretical resolution can be contemplated. The first is the real resolution of the radar sensor, determined by the antenna size and the frequency at which it operates (e.g., L-band, X-band, and Ku-band). The second form of operation, using a synthetic aperture, can support higher resolution than that of a real aperture, in which the motion of the vehicle is used to synthesize a larger effective antenna size with a resultant enhancement of resolution. As an illustration, a radiation detection device emitting electromagnetic waves with a wavelength of 16 cm (an example L-band wavelength) can detect much smaller features, even sub-millimeter in size, when the radiation detection device is configured for operation with a synthetic aperture.
[0028] The radiation detection device 112 can be configured as an active remote sensing technology. Similar to the light emissions from a laser profiler, the radiation detection device 112 can emit energy in the form of electromagnetic waves, albeit at a different wavelength than the optical detection device 108. Both sensors can operate as a radiation source and thereby illuminate a target surface. In addition, the radiation detection device 112 can detect reflected radiation (or backscatter) and data indicative of strength of the radiation. The radiation detection device 112 can also record an amount of backscatter. In one aspect, at least one feature of the collected data can serve as a fingerprint, or can be indicative, of a specific morphological feature (e.g., a crack or a boundary, such as a joint) of the monitored surface. Such a fingerprint provided by the data indicative of the reflected energy can be utilized to characterize the received signal and to create an image.
[0029] In one aspect, both the optical detection device 108 and radiation detection device 112 can emit many pulses of energy per second to create an image. In certain aspects, pulses can be transmitted at rates on the order of kHz. In both cases, high-precision clocks on the optical detection device 108 and radiation detection device 112 can record the time for each pulse of energy to return. Such information can be employed to calculate a distance or range to the feature reflecting the energy. Accordingly, these technologies can be referred to in manners that include the term "ranging": Light Detection and Ranging (LIDAR) and Radio Detection and Ranging (RADAR).
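The ranging calculation described above, converting a pulse's round-trip time into a distance, can be sketched as follows; the example round-trip time is an illustrative assumption.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, for both LIDAR and RADAR pulses

def range_from_round_trip(round_trip_seconds):
    """Distance to the reflecting feature: the pulse travels out and
    back, so the one-way range is half the round-trip path length."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 20 nanoseconds corresponds to a feature ~3 m away.
print(round(range_from_round_trip(20e-9), 2))  # -> 3.0
```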
[0030] In certain implementations, solid surfaces that can be probed can comprise asphalt pavement and concrete pavement in a variety of conditions of such surfaces. In one aspect, the radiation detection device 112 can be configured to emit radio waves in the Ku-band, X-band, and/or other electromagnetic radiation bands.
[0031] In one aspect, the strength of the returned radiation (e.g., visible radiation, microwave radiation) signal can depend on the characteristics of the imaged target. For example, certain surfaces can reflect laser and microwave energy very well, whereas other surfaces can scatter energy in a manner that does not return to the sensor and thus reduces the returned signal. In general, when laser or microwave energy impinges onto a smooth surface at an angle, the energy can be reflected away; whereas a rough surface can diffusively scatter the energy in various directions, returning a portion of the incident energy back to a sensor (e.g., the first radiation sensor 124 and/or second radiation sensor 125) that is part of, or functionally coupled to, the radiation detection device 112. The radiation detection device 112 also can comprise or be functionally coupled to a RADAR source (e.g., a source of radio waves, such as electromagnetic radiation having wavelengths ranging from about 1 cm to about 1 m).
[0032] Energy scattered away from the radiation detection device 112 typically results in less returned energy and, thus, detected signals with lower strength. In one aspect, a low-strength signal can be represented as a dark feature in an image. Accordingly, a rougher surface can scatter microwaves and, thus, can create a region of gray in the image. In another aspect, the brightest features in an image of a probed surface can occur in a scenario in which the radiation incident on the probed surface is reflected directly back to the sensor. Thus, in such scenario, roughness of the probed surface can be recorded as various shades of gray in an image.
[0033] Features made of different materials can interact differently with radiation (e.g., radar microwaves and infrared laser energy). For example, metal can reflect microwaves very well, which is why RADAR is good for imaging aircraft. As another example, concrete can be highly reflective of both microwaves and laser energy. In one aspect, asphalt tends to absorb a portion of both microwave and laser energy while asphalt's coarse surface can scatter the energy of both sensors. [0034] In one aspect, the look angle (e.g., the angle at which the radar sensor images a surface, such as look angle 126 or look angle 206) can determine how the energy is scattered and reflected back (or backscattered) and, thus, how a resulting image can be formed. For radiation in the microwave portion of the electromagnetic spectrum, the resulting image can be referred to as a microwave image. In one embodiment, RADAR's microwave energy obliquely striking the surface (e.g., a paved surface) can be reflected back to the RADAR sensor by the cracks in such surface, thus making one or more of the cracks detectable in both the microwave return signal and the microwave image.
[0035] In one aspect, a radiation detection device 112 can be mounted to the transport device 104 (e.g., in the bed of a truck) pointed backwards and oriented at an angle (e.g., 45 degrees) towards the pavement. Such configuration is referred to as the "real aperture" because the "aperture" is the size of the antenna, which creates a footprint on the ground that is determined by diffraction theory. For example, the first positioning device 120, first radiation source 122, and first radiation sensor 124 can be configured for detection based on a real aperture. In another aspect, the radiation detection device 112 can be positioned to point down to the side of the vehicle, such that the radiation source is oriented perpendicular to its direction of travel. Such configuration, which is referred to as "synthetic aperture radar," can permit utilization of the vehicle's movement to collect sequential sets of radiation data of a probed solid surface. These sequential images can then be synthesized into a single image that achieves the high resolution (e.g., cm-range) associated with a very large "synthesized" aperture. For example, the second positioning device 121, second radiation source 123, and second radiation sensor 125 can be configured for detection based on a synthetic aperture.
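The resolution contrast between the real-aperture and synthetic-aperture configurations can be illustrated with the standard diffraction-limited beamwidth and stripmap SAR azimuth-resolution relations. These textbook formulas, and the example wavelength, antenna length, and range, are illustrative assumptions rather than values given in the disclosure.

```python
def real_aperture_footprint(wavelength_m, antenna_length_m, range_m):
    """Along-track footprint of a real-aperture radar: the diffraction-
    limited beamwidth (wavelength / antenna length, in radians)
    projected to the target range."""
    return (wavelength_m / antenna_length_m) * range_m

def sar_azimuth_resolution(antenna_length_m):
    """Classic stripmap SAR azimuth resolution: half the physical
    antenna length, independent of range and wavelength."""
    return antenna_length_m / 2.0

# Ku-band example: 2 cm wavelength, 0.18 m (~7 inch) antenna, 10 m range.
print(round(real_aperture_footprint(0.02, 0.18, 10.0), 2))  # -> 1.11
print(sar_azimuth_resolution(0.18))                         # -> 0.09
```

The comparison shows why synthesizing a large aperture from the vehicle's motion can resolve features far smaller than the real-aperture footprint at the same range.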
[0036] In one aspect, a radiation detection device 112 configured with a synthesized aperture can detect minute cracking oriented orthogonal to the radiation detection device 112 side-look angle (e.g., look angle 206). For example, the theoretical resolution of a radiation detection device 112 configured to emit radiation in the Ku-band (e.g., from about 12 GHz to about 18 GHz) can be coarser than the cracks detected, and such cracks can be clearly visible in imagery produced from collected imaging data (e.g., strength of backscattered radiation). In one aspect, a radiation detection device 112 configured with a synthetic aperture can operate from an aircraft flown at, for example, 1,000 feet above ground level.
[0037] In one aspect, a radiation detection device 112 configured in the real aperture configuration can detect linear cracking that can be oriented orthogonal to the radar angle of view (e.g., look angle 126). In one aspect, an exemplary radiation detection device 112 in the real aperture configuration can be designed to detect cracks as a transport device 104 carrying the detection equipment moves over a solid surface (e.g., pavement) being probed. Radiation data collected in this configuration can be plotted as a continuously moving "waterfall" of pavement data. In other words, a feature, such as a crack, can appear as a smear through time, with the image having time on one axis and the crack "signature" on the other axis (see, for example, FIGs. 3-5). In one aspect, a "signature" can represent
electromagnetic wave energy that has reflected back to the radiation detection device 112.
[0038] In one aspect, Global Positioning System (GPS) coordinates can be associated with, or linked to, these signatures, and, as a result, geocoding of the crack signatures can be attained. In certain aspects, geocoding the radiation data, such as data indicative of the signal strength of backscattered radiation, with other sensor data (data acquired by a video camera, data generated by a LIDAR profiler, etc.) can comprise seven parameters. For example, in one aspect, a first set of three parameters can be the xyz coordinates calculated from the global positioning system. These xyz coordinates can be transformed from the native earth-centered-earth-fixed spherical coordinate system that GPS utilizes into local plane coordinates used by local mapping systems. Another set of three parameters can comprise orientation parameters from the inertial reference system, which can provide the orientation of the radar within the coordinate space (see, e.g., FIG. 2). The orientation parameters can comprise a set of three-dimensional orientation angles describing the look angle of the radar system (see, e.g., FIG. 2). Yet another parameter of the seven parameters can be a time stamp (e.g., a time record) obtained, for example, from the GPS. The time stamp can be applied to all other data sets, including but not limited to optical imaging data (e.g., video imagery and LIDAR profiler data), thereby allowing the integration of the complete pavement imaging solution.
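The seven-parameter geocoding record described above (three position coordinates, three orientation angles, and a time stamp) can be sketched as a simple data structure used to tag each sensor sample. The field names and example values are illustrative assumptions, not identifiers from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class GeocodeRecord:
    """Seven geocoding parameters attached to each sensor sample:
    xyz position (from GPS, transformed to local plane coordinates),
    three orientation angles (from the inertial reference system),
    and a GPS time stamp shared by all sensor data streams."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float
    gps_time: float

def tag_sample(signal_strength, record):
    """Associate a backscatter measurement with its geocode record."""
    return {"strength": signal_strength, "geocode": record}

rec = GeocodeRecord(x=431.2, y=882.7, z=12.4,
                    roll=0.5, pitch=-1.2, yaw=45.0, gps_time=1358700000.0)
tagged = tag_sample(0.83, rec)
print(tagged["geocode"].gps_time)  # -> 1358700000.0
```

Because the shared time stamp appears in every record, it can serve as the join key that synchronizes the radiation data with the video and LIDAR data sets.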
[0039] In one aspect, the system 102 can collect a substantial amount of data. For example, amounts of data on the order of a few gigabytes per minute can be common for the radiation detection device 112. In one aspect, such data can be rich and complex, particularly, yet not exclusively, because a portion of the data can be associated with roughness of a probed surface (e.g., pavement) that does not present cracking or another morphological anomaly or feature that is intended to be detected. In certain implementations, one or more methods can be implemented to analyze the collected data, identifying and/or removing contributions to the data arising from such roughness (referred to as "roughness data"), and yielding information associated directly with cracking. Radiation data collected in the real aperture configuration can be utilized to detect surface roughness. Such integration of real-aperture-based radiation detection can provide a very useful sensor in an integrated pavement management system, the sensor reducing manual inspection time, and other human intervention time, since the cracks can be pre-detected automatically.
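One simple way to separate a slowly varying roughness background from sharp crack signatures, in the spirit of the analysis described above, is a moving-average high-pass filter. This particular filter and its window size are illustrative assumptions, not the method claimed by the disclosure.

```python
def remove_roughness(signal, window=5):
    """Subtract a moving-average estimate of the roughness background,
    leaving sharp deviations (candidate crack signatures) as residuals."""
    half = window // 2
    residual = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        background = sum(signal[lo:hi]) / (hi - lo)  # local mean
        residual.append(signal[i] - background)
    return residual

# Smooth pavement with one sharp crack return at index 3:
signal = [0.2, 0.2, 0.2, 1.0, 0.2, 0.2, 0.2]
residual = remove_roughness(signal)
# The crack stands out as the largest positive residual.
print(max(range(len(residual)), key=lambda i: residual[i]))  # -> 3
```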
[0040] As an illustration, an exemplary radiation detection device 112 can comprise a RADAR device weighing 3.5 pounds with an antenna seven inches long. Such a RADAR unit can consume several watts to operate. Other RADAR units having less weight than 3.5 pounds also can be utilized. For example, RADAR units embodied on one or more integrated circuits that include antenna(s) can be utilized. For example, a radiation detection device 112 (e.g., comprising a RADAR) can be mounted in the bed of a truck on a wooden scaffold about five feet above the ground. The antenna can be pointed at a 45 degree angle towards the pavement to create an oblique view from the back of the vehicle. By way of illustration, the radiation detection device 112 can comprise a RADAR unit manufactured by ImSAR or Artemis.
[0041] Applications are not limited to road pavement, but may also include bridges, aircraft runways, building roofs, parking lots, etc. In one aspect, the optical detection device can comprise a laser profiler (or LIDAR profiler) pointed at the solid surface (e.g., pavement) that can permit characterizing the morphology of the solid surface.
[0042] The location device 106 can comprise a GPS for xyz coordinate geolocation augmented with an inertial reference system (IRS) that can provide xyz coordinate information when a GPS fix is unavailable. Radiation data obtained with the radiation detection device 112, optical imaging data obtained from the optical detection device 108, location data obtained from the location device 106, and other data can be output to a user device, e.g., a device of a pavement engineer, for analysis of imagery associated with the data.
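By way of non-limiting illustration, the GPS/IRS augmentation can be sketched as a simple fallback: use the GPS fix when one is available; otherwise dead-reckon from the last known position using inertial velocity. Real IRS integration also propagates orientation and corrects drift; the interface below is an illustrative assumption:

```python
def position_fix(gps_fix, last_position, velocity, dt):
    """Return the GPS fix (x, y, z) when available; otherwise
    dead-reckon from the last known position using the inertial
    reference system's velocity estimate over elapsed time dt.
    Minimal sketch of GPS/IRS augmentation, not a full filter."""
    if gps_fix is not None:
        return gps_fix
    x, y, z = last_position
    vx, vy, vz = velocity
    return (x + vx * dt, y + vy * dt, z + vz * dt)
```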
[0043] In one aspect, the radiation detection device 112 can be pointed ahead, behind, or to the side of the transportation device 104 so as to form an angle (e.g., look angle 126) allowing the electromagnetic energy to concentrate in the solid surface and morphological features thereof, such as pavement cracks, and be reflected back (or back scattered) to an antenna in the radiation detection device 112. In one aspect, the optical detection device 108 can comprise a video camera pointed down towards the surface (e.g., the pavement) and a LIDAR profiler. The location device 106 can comprise a GPS and IRS device. The location device 106 can provide the data to geo-locate all or a portion of the collected imaging data, and to synchronize, using time and coordinate stamps, for example, one or more data sets having such data.
[0044] In one embodiment, the radiation detection device 112 can be mounted on the front or back of the transportation device 104, oriented at an angle (e.g., look angle 126) down towards the solid surface (e.g., pavement). As the transportation device 104 moves, the radiation detection device 112 can emit electromagnetic pulses, for example, that are scattered and reflected back to the radiation detection device 112. Because of the way the electromagnetic pulses are concentrated and reflected back to the radiation detection device 112, such stronger returns can be indicative of the location of a specific morphology feature (e.g., a crack) of the solid surface. Information associated with signal strength of the returned radar pulses can form a fingerprint indicative of roughness of the solid surface (e.g., pavement). With knowledge of, or access to, the location of the morphology feature (e.g., the crack) from the electromagnetic pulse, the speed of the vehicle, and spatial coordinates thereof, the system 102 can geo-locate the crack location and surface roughness with the other data being collected, which can comprise the video imagery and LIDAR profile data.
[0045] In one aspect, the radiation detection device 112 can provide a triggering mechanism that indicates the presence of a crack or anomaly in the pavement, flagging the integrated GPS/IRS, video imagery, and LIDAR profile data for follow up investigation by the pavement engineer. One exemplary operation of the disclosure can comprise the radiation detection device 112 performing a first inspection of the solid surface (e.g., a road) and flagging video imagery obtained with the optical detection device 108 (e.g., with the video camera and LIDAR-based imaging data). Such flagging step, or identification step, can simplify inspection and interpretation of conditions of the surface, such as pavement, by a field engineer (e.g., a pavement engineer).
[0046] In one implementation, a conventional survey vehicle suitable (e.g., designed) for pavement imaging and profiling can be configured or fitted with the radiation detection device 112 by adding, for example, the radiation detection device 112 in either a forward or backward looking direction, in a manner such that at least a portion of the solid surface (e.g., a lane of pavement) on which the survey vehicle moves also is being imaged with the radiation detection device 112 and other components of the disclosure.
[0047] FIG. 3 is a plot illustrating exemplary radiation data. Specifically, FIG. 3 is a waterfall plot showing the received signal from a radiation detection unit (in this case, a RADAR) with distance traveled on the y axis and time on the x axis. The plot illustrates radiation data collected as the transportation device moves past two well-defined metallic features. The background color (dark gray) represents random noise of the pavement, and the speckled pattern is caused by the roughness of the pavement. The lighter tones of gray indicate returned signals: one for each of the metal features. Such a graph can be described as a "waterfall" plot because the plot tracks the signal through time and over distance. Return signals of the strength represented by the metal features can be detected with automated signal processing software. In one aspect, methods for analyzing the radiation data can utilize a matched filter approach to extract the signal from the clutter (or noise). These return signals detected in the radiation data can be geo-referenced, or geocoded, by linking coordinate information from the global positioning system and a backup inertial navigation system. Thus, cracks and other features can be geo-referenced, or geocoded. Accordingly, a concrete sidewalk with well-defined joints can be probed or monitored. Driving the transport vehicle over the sidewalk can provide radiation data indicating clear return signals as features in a waterfall plot.
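By way of non-limiting illustration, a matched filter slides a template of the expected echo shape along the received signal and scores the normalized correlation at each lag; peaks near 1.0 mark candidate returns in the clutter. The template shape, threshold, and function names below are illustrative assumptions:

```python
def matched_filter(signal, template):
    """Normalized cross-correlation of a signal with an expected-echo
    template; scores near 1.0 indicate close template matches.
    Illustrative sketch of a matched filter approach only."""
    n = len(template)
    energy = sum(t * t for t in template) ** 0.5
    scores = []
    for i in range(len(signal) - n + 1):
        window = signal[i:i + n]
        dot = sum(s * t for s, t in zip(window, template))
        norm = (sum(s * s for s in window) ** 0.5) * energy
        scores.append(dot / norm if norm else 0.0)
    return scores

def detect_peaks(scores, threshold=0.9):
    """Lags whose correlation score meets an (assumed) threshold."""
    return [i for i, s in enumerate(scores) if s >= threshold]
```

For example, correlating a signal containing one triangular echo against a triangular template yields a single score of 1.0 at the echo's lag, with lower scores elsewhere.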
[0048] FIG. 4 is another plot illustrating radiation data of an example concrete sidewalk. Specifically, FIG. 4A is an example concrete sidewalk, and FIG. 4B is a waterfall plot illustrating cracks in the concrete sidewalk. Concrete sidewalk cracks can be clearly observed in the plot as a succession of lines at the lower range scale (e.g., less than 30). Collected data and associated analysis demonstrate that real aperture based detection can be accomplished at close range. In one aspect, joints and cracking on concrete roadways can be detected when oriented perpendicular to the line of sight along which the radar antenna is oriented. In another aspect, alternative antenna configuration(s) and/or orientation(s) can detect cracking in other directions. For instance, steering the antenna beam in various directions can permit the radar beam to sweep the pavement at continuously or nearly continuously changing angles as the vehicle drives over the pavement. In one aspect, switched-beam electronically scanned arrays on monolithic microwave integrated circuits can be or can comprise a radiation detection device configured to operate with antenna sweeping. In another aspect, two or more radiation detection devices can emit radiation in orthogonal directions to detect features oriented in different directions.
[0049] FIGs. 5A and 5B are plots illustrating radiation data of exemplary asphalt roads. The results of the asphalt imaging test can be best characterized by comparing two scenes: one for a pristine asphalt road with no discernible cracks, as shown in FIG. 5A, and one for a road with a maze of "spider" cracks, as shown in FIG. 5B. The waterfall images show asphalt pavement in heavily cracked (upper) and new (lower) condition. In comparison to the cracking detected in concrete surfaces, cracking detected in radiation data for asphalt can be more subtle. In one aspect, cracks of about 2 centimeters and larger can be identified. The substantive volume of data from the RADAR sensor can be processed according to methodology(ies) described herein. Such techniques can be suitably modified so that the radiation signal returns can be classified and geocoded to show the crack information in real time with the other data sets from the imagery and LIDAR sensors. Such an approach can utilize statistical filters and artificial neural networks to classify features based on the morphology and importance of the feature for later examination by pavement engineers.
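By way of non-limiting illustration, the smallest artificial neural network — a single logistic neuron over hand-crafted morphological features such as peak amplitude and echo width — can sketch how return signals might be classified as "crack" versus "no crack." The weights, bias, feature choices, and labels below are hand-picked illustrative assumptions, not trained on real pavement data:

```python
import math

def neuron(features, weights, bias):
    """A single logistic neuron: weighted sum plus bias, passed
    through a sigmoid to yield a score in (0, 1). Illustrative
    stand-in for the neural-network classification mentioned above."""
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def classify(peak_amplitude, echo_width, threshold=0.5):
    """Classify a return signal from two assumed morphological
    features; weights and bias are illustrative, untrained values."""
    score = neuron([peak_amplitude, echo_width], weights=[2.0, -0.5], bias=-1.0)
    return "crack" if score >= threshold else "no crack"
```

A strong, narrow echo scores high and is labeled a crack, while a weak, wide return is not; a practical system would learn such weights from labeled survey data.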
[0050] In one aspect, the radiation detection device can automatically detect the presence and characteristics of the pavement cracking, and then integrate this information with data from other sensors such as camera and laser pavement imaging/profiling systems. Such integrated approach can provide (e.g., output) data that can assist pavement engineers in their visual inspection of camera imagery. In one aspect, the data can comprise data and/or metadata indicative of images requiring inspection based on the detected cracks. For instance, the data and/or metadata can indicate (e.g., flag) such images. This improvement in efficiency, in one aspect, renders the disclosed imaging technology advantageous over conventional solutions.
[0051] FIGs. 6A and 6B illustrate radiation data collected from an aircraft. FIG. 6A is an exemplary image of a surface received by an optical imaging device. FIG. 6B shows exemplary radiation data overlaid on the image. The exemplary radiation data can be received from a synthetic aperture radar (SAR). A system configured on a small aircraft, such as a Cessna 172, can be employed to collect data. Theoretically, the operation of the SAR from an aircraft can yield the same or similar imaging results as from a truck. In addition to waterfall plots, richer high-resolution images can be produced from the data collected by the aircraft. As in the case of the real-aperture radar, SAR can easily detect sidewalk cracks in concrete oriented orthogonally to the antenna orientation.
[0052] In one implementation, the SAR image was detected by an aircraft flying along a direction parallel to the sidewalk cracks detected. In such implementation, cracks that run parallel to the flight line can be easily observed, whereas those sidewalk cracks running perpendicular to the flight direction may be more difficult to detect. In the real aperture based configuration, cracking was detected as return signals when passing over concrete cracks. Such approach includes data processing method(s) for removing signal noise that can mask fingerprint(s) of cracks. In one aspect, real aperture based detection can yield a signal that can be traced through time and distance on a waterfall plot.
[0053] In another aspect, as shown in FIG. 6B, SAR can generate a real image that appears similar to an air photograph. Such forms of detection can be geo-referenced to permit integration of other pavement imaging technologies with real aperture and synthetic aperture technologies. In one aspect, the image can indicate cracks or seams in the surface. As shown in FIG. 6B, the regular pattern of white lines can illustrate seams 602 or other lines such as cracks detected on the sidewalk. Without intending to be limited by theory, modeling, and/or simulation, it is believed that greater roughness of the asphalt compared to concrete surfaces can result in lower signal-to-noise ratio for the data obtained from experiments in asphalt. Pavement cracks and joints were best detected when oriented perpendicular to the antenna orientation.
[0054] One or more methods for analyzing imaging data obtained according to aspects of the disclosure can process (e.g., manage, operate on, and/or the like) the substantive size of the datasets, decimate data, filter/reduce/eliminate noise from signal, classify signals as false positives or true positives, extract the true positives, and then attempt to classify the true positive signals.
[0055] Airborne experiments of the disclosure demonstrate that pavement inspection can be performed from transportation devices other than terrestrial vehicles. For example, a miniaturized SAR or other radiation detection device can be operated from unmanned aerial systems flying at lower altitudes. It should be appreciated that airborne SAR can be performed on platforms in deep space (e.g., satellites) and high flying jets, for example. The close-range SAR of the disclosure, and related embodiments of systems and methods, can synthesize the aperture at very short distances and low altitudes, the platform still traveling enough distance to synthesize the aperture.
[0056] In view of the aspects described hereinbefore, at least one exemplary method that can be implemented in accordance with the disclosed subject matter can be better appreciated with reference to the flowcharts in FIG. 7 and FIG. 8. For purposes of simplicity of explanation, the exemplary methods disclosed herein are presented and described as a series of acts; however, it is to be understood and appreciated that the claimed subject matter is not limited by the order of acts, as some acts may occur in different orders and/or concurrently with other acts from that shown and described herein. For example, the various methods or processes of the subject disclosure can alternatively be represented as a series of interrelated states or events, such as in a state diagram.
Furthermore, not all illustrated acts may be required to implement a method in accordance with the subject disclosure. Further yet, two or more of the disclosed methods or processes can be implemented in combination with each other, to accomplish one or more features or advantages herein described.
[0057] It should be further appreciated that the exemplary methods disclosed throughout the subject specification can be stored on an article of manufacture, or computer-readable medium, to facilitate transporting and transferring such methods to a computing device (e.g., a desktop computer, a mobile computer, a mobile telephone, a blade computer, a programmable logic controller, and the like) for execution, and thus
implementation, by a processor of the computing device or for storage in a memory thereof.
[0058] FIG. 7 is a flowchart illustrating an exemplary method for feature detection. At block 702, radiation can be provided to the surface. For example, an antenna can emit electromagnetic radiation. In one aspect, at least a portion of the radiation can yield the radiation backscattered from a surface in response to interacting with the surface. For example, the surface can comprise a surface of at least one of a road, a parking lot, a driveway, a rooftop, an aircraft runway, a bridge, a sidewalk, and the like.
[0059] At block 704, radiation data can be received based on radiation backscattered from a surface. At block 706, a feature of the surface can be identified in the radiation data. The feature can comprise at least one of a crack, a hole, an indentation in the surface, and the like. For example, a signal strength value above a threshold value can be identified from the radiation data. In one aspect, the signal strength value can indicate the amount of radiation backscattered from one or more locations on the surface. In one aspect, the feature can reflect a greater amount of radiation back to a radiation detection device than areas of the surface surrounding the feature. As another example, a pattern of the signal strength values over time can be identified. An example pattern can comprise a series of signal strengths above a threshold value. The threshold value can comprise a specified signal strength.
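By way of non-limiting illustration, identifying a pattern of signal strength values above a threshold can be sketched as flagging runs of consecutive above-threshold samples: a single high sample may be noise, while a sustained run is reported as a candidate feature. The threshold and minimum run length below are illustrative assumptions:

```python
def flag_candidates(samples, threshold, min_run=2):
    """Return (start, end) index pairs for runs of at least min_run
    consecutive signal-strength samples above threshold.
    Illustrative sketch of the threshold/pattern test at block 706."""
    runs, start = [], None
    for i, s in enumerate(samples):
        if s > threshold:
            if start is None:
                start = i
        elif start is not None:
            if i - start >= min_run:
                runs.append((start, i - 1))
            start = None
    if start is not None and len(samples) - start >= min_run:
        runs.append((start, len(samples) - 1))
    return runs
```

With a threshold of 4, the series [0, 5, 6, 0, 7, 0] yields one candidate run over indices 1–2; the isolated spike at index 4 is discarded as a likely false positive.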
[0060] At block 708, optical imaging data associated with the surface can be received. For example, data from a light detection and ranging device can be received. In one aspect, data from the light detection and ranging device can comprise distance of the feature from the light detection and ranging device and/or spatial characteristics of the feature. The spatial characteristics can comprise height, length, depth, shape, edge information, and the like.
[0061] At block 710, a portion of the optical imaging data comprising the feature can be identified. At block 712, a geographic location can be associated with the feature. For example, geographic data can be received from at least one of a geographic positioning system device and an inertial reference system device. The geographic data can be associated with a time and/or with other data, such as radiation data and optical imaging data. At block 714, a representation of the feature based on the optical imaging data and the associated geographic location can be provided. For example, the representation of the feature can comprise a three dimensional computer generated model. The three dimensional model can be based on, for example, data from a LIDAR device and/or data from a radiation detection device. In another aspect, the representation can comprise a two dimensional image, such as a photograph of the feature.
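By way of non-limiting illustration, associating a geographic location with a feature (block 712) can be sketched as interpolating the vehicle's GPS track at the feature's time stamp. The function below assumes a sorted list of fix times with matching (x, y) plane coordinates; these names and the linear interpolation are illustrative assumptions:

```python
from bisect import bisect_left

def geocode(feature_time, gps_times, gps_positions):
    """Estimate the platform's (x, y) position at the instant a
    feature echo was time-stamped, by linear interpolation along
    the GPS track. Assumes gps_times is sorted ascending; times
    outside the track clamp to the nearest fix. Illustrative only."""
    i = bisect_left(gps_times, feature_time)
    if i == 0:
        return gps_positions[0]
    if i == len(gps_times):
        return gps_positions[-1]
    t0, t1 = gps_times[i - 1], gps_times[i]
    (x0, y0), (x1, y1) = gps_positions[i - 1], gps_positions[i]
    frac = (feature_time - t0) / (t1 - t0)
    return (x0 + frac * (x1 - x0), y0 + frac * (y1 - y0))
```

Because every data set carries the same GPS time stamp, the same lookup can associate the feature with the corresponding frames of video imagery and LIDAR profile data.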
[0062] FIG. 8 illustrates a flowchart of an exemplary method 800 for monitoring a solid surface via close-range SAR in accordance with one or more aspects of the disclosure. In one embodiment, the exemplary system 102 described in connection with FIG. 1 can carry out the exemplary method 800. At block 802, backscattered radiation from a solid surface can be collected at a radiation detection device (e.g., RADAR). Block 802 can be referred to as the collecting backscattered radiation step and can comprise delivering radiation onto the surface, at least a portion of the radiation yielding the backscattered radiation in response to interacting with the surface. In another aspect, the collecting backscattered radiation step can comprise generating a time series of data records associated with signal strength of the backscattered radiation.
[0063] At block 804, at least one feature associated with morphology of the solid surface can be extracted, by a computing device (e.g., computing device 901) from data yielded from the collected backscattered radiation. Block 804 can be referred to as an extracting step. In one aspect, the extracting step can comprise analyzing the time series (e.g., the radiation data over time, example illustrations of such are found in the waterfall plots of FIGs. 3-5) and, in response, identifying a signal level above a threshold.
[0064] At block 806, optical imaging data associated with the solid surface can be collected at an optical detection device (e.g., a video camera, a CCD camera, a LIDAR unit, etc.). At block 808, the at least one feature can be associated, by the computing device, with at least a portion of the optical imaging data associated with the solid surface. At block 810, the at least one feature can be geocoded, by the computing device, with location data. Block 810 can be referred to as the geocoding step. In one aspect, the geocoding step can comprise associating the at least one feature with the location data, the location data comprising one or more of global positioning system based data or inertial reference system based data.
[0065] In one embodiment, the exemplary method 800 also can comprise associating, by the computing device, the at least one feature with at least a portion of the optical imaging data associated with the solid surface.
[0066] Various advantages or efficiencies emerge from the foregoing disclosure. In scenarios in which the solid surface is or comprises pavement, it should be appreciated that in conventional maintenance solutions, visual inspection of the pavement typically requires a trained pavement engineer to visually inspect one or more frames of video imagery to detect and then analyze pavement cracking and surface conditions. In contrast, the disclosed radiation detection, when used in conjunction with video imaging techniques, for example, can permit the automated detection of pavement cracking and conditions, thus mitigating or avoiding manual inspection or other human intervention, and analysis of video imagery.
[0067] As described herein, in certain implementations, the disclosure can be advantageous over conventional technologies to various service sectors associated with transportation. For example, pavement engineers, pavement imaging companies, city and county engineers, and state and federal departments of transportation can benefit from one or more of the aspects or features of the disclosure by achieving efficiencies not available through conventional technologies. The disclosure also can have other economic implications, such as better utilization of taxes resulting from reduced costs of pavement inspection and better pavement management strategies. Private sector and public sector organizations that utilize roads can benefit from the invention by being able to implement more adequate or efficient logistics with better pavement management systems.
[0068] In addition or in the alternative, it is noted that aspects of the disclosure contemplate use of electromagnetic waves at substantively close ranges - closer than theoretical "range gates" that suggest the minimum distance the sensor can be from the feature being imaged. It is also noted that conventional SAR solutions can be mounted on satellites and aircraft. In contrast, in certain aspects of the disclosure, the close-range SAR can accomplish radar imaging while mounted on a terrestrial vehicle, such as a trailer towed behind a truck. It is further noted that the disclosed radiation detection attained in a system mounted on a terrestrial vehicle can comprise mapping of radiation data to optical imaging data and/or time stamps.
[0069] FIG. 9 illustrates a block diagram of an exemplary operating environment 900 having a computing device 901 that enables various features of the disclosure and performance of the various methods disclosed herein. Computing device 901 can embody a data processing unit that can be part of the radiation detection device or can be functionally coupled thereto. This exemplary operating environment 900 is only an example of an operating environment and is not intended to suggest any limitation as to the scope of use or functionality of operating environment architecture. Neither should the exemplary operating environment 900 be interpreted as having any dependency or requirement relating to any one or combination of functional elements (e.g., units, components, adapters, or the like) illustrated in such exemplary operating environment.
[0070] The various embodiments of the disclosure can be operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that can be suitable for use with the systems and methods of the disclosure comprise personal computers, server computers, laptop devices or handheld devices, and multiprocessor systems. Additional examples comprise mobile devices, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that comprise any of the above systems or devices, and the like.
[0071] The processing effected in the disclosed systems and methods can be performed by software components. In one aspect, the disclosed systems and methods can be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers, such as computing device 901, or other computing devices. Generally, program modules comprise computer code, routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The disclosed methods also can be practiced in grid-based and distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote computer storage media including memory storage devices.
[0072] Further, one skilled in the art will appreciate that the systems and methods disclosed herein can be implemented via a general-purpose computing device in the form of a computing device 901. The components of the computer 901 can comprise, but are not limited to, one or more processors 903, or processing units 903, a system memory 912, and a system bus 913 that couples various system components including the processor 903 to the system memory 912. In the case of multiple processing units 903, the system can utilize parallel computing.
[0073] In general, a processor 903 or a processing unit 903 refers to any computing processing unit or processing device comprising, but not limited to, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Additionally or alternatively, a processor 903 or processing unit 903 can refer to an integrated circuit, an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic controller (PLC), a complex programmable logic device (CPLD), a discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. Processors or processing units referred to herein can exploit nano-scale architectures such as molecular and quantum-dot based transistors, switches, and gates, in order to optimize space usage or enhance performance of the computing devices that can implement the various aspects of the subject disclosure.
Processor 903 or processing unit 903 also can be implemented as a combination of computing processing units.
[0074] The system bus 913 represents one or more of several possible types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can comprise an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, an Accelerated Graphics Port (AGP) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express bus, a Personal Computer Memory Card International Association (PCMCIA) bus, a Universal Serial Bus (USB), and the like. The bus 913, and all buses specified in this specification and annexed drawings, also can be implemented over a wired or wireless network connection, and each of the subsystems, including the processor 903, a mass storage device 904, an operating system 905, feature detection software 906, feature detection data 907, a network adapter 908, system memory 912, an Input/Output Interface 910, a display adapter 909, a display device 911, and a human machine interface 902, can be contained within one or more remote computing devices 914a,b,c at physically separate locations, functionally coupled (e.g., communicatively coupled) through buses of this form, in effect implementing a fully distributed system.
[0075] Feature detection software 906 can configure the computing device 901, or a processor thereof, to perform one or more of imaging data collection, analysis of at least a portion of such data for feature extraction associated with morphological features of a probed solid surface, and geocoding of at least a portion of the imaging data and location data in accordance with aspects of the disclosure. Feature detection software 906 can be retained in a memory as a group of computer-accessible instructions, e.g., computer-readable instructions, computer-executable instructions, or computer-readable computer-executable instructions. In one aspect, the group of computer-accessible instructions can encode one or more methods of the disclosure. In another aspect, the group of computer-accessible instructions can encode various formalisms for feature extraction, such as wavelet analysis, autonomous classification, or the like. Certain implementations of feature detection software 906 can include a compiled instance of such computer-accessible instructions, a linked instance of such computer-accessible instructions, a compiled and linked instance of such computer-executable instructions, or an otherwise executable instance of the group of computer-accessible instructions.
[0076] Feature detection data 907 can comprise various types of data that can permit implementation (e.g., compilation, linking, execution, and combinations thereof) of the feature detection software 906. In one aspect, feature detection data 907 can comprise imaging data described herein, such as data available for waterfall plots, and data structures containing information associated with imaging data, location data, and geocoding.
[0077] The computing device 901 typically comprises a variety of computer readable media. Exemplary readable media can be any available media that is accessible by the computer 901 and comprises, for example and not meant to be limiting, both volatile and non-volatile media, removable and non-removable media. The system memory 912 comprises computer readable media in the form of volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read only memory (ROM). The system memory 912 typically contains data (such as a group of tokens employed for code buffers) and/or program modules such as operating system 905 and feature detection software 906 that are immediately accessible to and/or are presently operated on by the processing unit 903. Operating system 905 can comprise OSs such as Windows operating system, Unix, Linux, Symbian, Android, iOS, Chromium, and substantially any operating system for wireless computing devices or tethered computing devices.
[0078] In another aspect, computing device 901 can comprise other removable/nonremovable, volatile/non-volatile computer storage media. As illustrated, computing device 901 comprises a mass storage device 904 which can provide non-volatile storage of computer code (e.g., computer-executable instructions), computer-readable instructions, data structures, program modules, and other data for the computing device 901. For instance, a mass storage device 904 can be a hard disk, a removable magnetic disk, a removable optical disk, magnetic cassettes or other magnetic storage devices, flash memory cards, CD-ROM, digital versatile disks (DVD) or other optical storage, random access memories (RAM), read only memories (ROM), electrically erasable programmable read-only memory (EEPROM), and the like.
[0079] Optionally, any number of program modules can be stored on the mass storage device 904, including by way of example, an operating system 905, and feature detection software 906. Each of the operating system 905 and feature detection software 906 (or some combination thereof) can comprise elements of the programming and the feature detection software 906. Data and code (e.g., computer-executable instruction(s)) can be retained as part of feature detection software 906 and can be stored on the mass storage device 904. Feature detection software 906, and related data and code, can be stored in any of one or more databases known in the art. Examples of such databases comprise, DB2®, Microsoft® Access, Microsoft® SQL Server, Oracle®, mySQL, PostgreSQL, and the like. Further examples include membase databases and flat file databases. The databases can be centralized or distributed across multiple systems.
[0080] In another aspect, a user can enter commands and information into the computing device 901 via an input device (not shown). Examples of such input devices comprise, but are not limited to, a camera; a keyboard; a pointing device (e.g., a "mouse"); a microphone; a joystick; a scanner (e.g., barcode scanner); a reader device such as a
radio frequency identification (RFID) reader or a magnetic stripe reader; gesture-based input devices such as tactile input devices (e.g., touch screens, gloves, and other body coverings or wearable devices), speech recognition devices, or natural interfaces; and the like. These and other input devices can be connected to the processing unit 903 via a human machine interface 902 that is coupled to the system bus 913, but can be connected by other interface and bus structures, such as a parallel port, a game port, an IEEE 1394 port (also known as a FireWire port), a serial port, or a universal serial bus (USB).
[0081] In yet another aspect, a display device 911 also can be functionally coupled to the system bus 913 via an interface, such as a display adapter 909. It is contemplated that the computer 901 can have more than one display adapter 909 and the computer 901 can have more than one display device 911. For example, a display device can be a monitor, an LCD (Liquid Crystal Display), or a projector. In addition to the display device 911, other output peripheral devices can comprise components such as speakers (not shown) and a printer (not shown) which can be connected to the computer 901 via Input/Output Interface 910. Any step and/or result of the methods can be output in any form to an output device. Such output can be any form of visual representation, including, but not limited to, textual, graphical, animation, audio, tactile, and the like.
[0082] As illustrated, one or more sensor(s) 918 can be functionally coupled to the system bus 913 through an I/O interface of the one or more I/O interface(s) 910. The sensor(s) can comprise radiation detection devices (e.g., RADAR) and/or optical detection devices (e.g., LIDAR, camera system). Through the functional coupling through such I/O interface, the one or more sensor(s) can be functionally coupled to other functional elements of the computing device. In one embodiment, the I/O interface, at least a portion of the system bus 913, and system memory 912 can embody a data collection unit that can permit receiving data acquired by at least one of the one or more sensor(s) 918. Such data collection unit can be an analog unit or a unit for collection of digital data, or a combination thereof. In the case of an analog unit, processor 903 can provide analog-to-digital functionality and decoder functionality, and the I/O interface can include circuitry to collect the analog signal received from at least one sensor of the one or more sensor(s) 918.
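A minimal Python sketch of the data collection unit described in paragraph [0082] follows. All names here (`digitize`, `DataCollectionUnit`, the 12-bit resolution) are illustrative assumptions, not taken from the patent; the sketch only shows how analog sensor samples could be quantized while digital samples pass through unchanged.

```python
def digitize(analog_value, full_scale=1.0, bits=12):
    """Model of analog-to-digital conversion: quantize an analog sample
    into an unsigned integer code for a converter with the given range."""
    levels = 2 ** bits
    # Clamp the sample to the converter's input range, then scale to codes.
    clamped = min(max(analog_value, 0.0), full_scale)
    return min(int(clamped / full_scale * levels), levels - 1)

class DataCollectionUnit:
    """Collects samples from one or more sensors; analog inputs are
    digitized, already-digital inputs are stored as-is."""
    def __init__(self):
        self.samples = []

    def receive(self, value, analog=False):
        self.samples.append(digitize(value) if analog else value)

unit = DataCollectionUnit()
unit.receive(0.5, analog=True)   # analog sample, quantized to a 12-bit code
unit.receive(2048)               # digital sample, stored unchanged
```

In a combined analog/digital unit, as the paragraph contemplates, the `analog` flag would correspond to which input circuitry the sample arrived through.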
[0083] The computing device 901 can operate in a networked environment (e.g., an industrial environment) using logical connections to one or more remote computing devices 914a,b,c, and equipment 916. By way of example, a remote computing device can be a personal computer, portable computer, a mobile telephone, a server, a router, a network computer, a peer device or other common network node, and so on. Logical connections between the computer 901 and a remote computing device 914a,b,c can be made via a local area network (LAN) and a general wide area network (WAN). Such network connections can be implemented through a network adapter 908. A network adapter 908 can be implemented in both wired and wireless environments. Such networking environments can be conventional and commonplace in offices, enterprise-wide computer networks, and intranets. The networking environments generally can be embodied in wireline networks or wireless networks (e.g., cellular networks, such as Third Generation (3G) and Fourth Generation (4G) cellular networks, or facility-based networks (femtocell, picocell, Wi-Fi networks, etc.)). A group of one or more network(s) 915 can provide such networking environments. In one scenario, the remote computing devices 914a,b,c can embody additional sensor(s), such as inertial guidance system(s). In another scenario, equipment 916 can comprise various parts of the exemplary system illustrated in FIG. 1. In yet another scenario, equipment 916 also can comprise an inertial guidance system.
[0084] As an illustration, application programs and other executable program components such as the operating system 905 are illustrated herein as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computing device 901, and are executed by the data processor(s) of the computer. An implementation of feature detection software 906 can be stored on or transmitted across some form of computer readable media. Any of the disclosed methods can be performed by computer readable instructions embodied on computer readable media. Computer readable media can be any available media that can be accessed by a computer. By way of example and not meant to be limiting, computer-readable media can comprise "computer storage media," or "computer-readable storage media," and "communications media." "Computer storage media" comprise volatile and non-volatile, removable and non-removable media implemented in any methods or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. Exemplary computer storage media comprise, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In various embodiments, the systems and methods of the subject disclosure for surface feature detection can employ artificial intelligence (AI) techniques such as machine learning and iterative learning.
Examples of such techniques include, but are not limited to, expert systems, case based reasoning, Bayesian networks, behavior based AI, neural networks, fuzzy systems, evolutionary computation (e.g., genetic algorithms), swarm intelligence (e.g., ant algorithms), and hybrid intelligent systems (e.g., expert inference rules generated through a neural network or production rules from statistical learning).
[0085] While the systems, devices, apparatuses, protocols, processes, and methods have been described in connection with exemplary embodiments and specific illustrations, it is not intended that the scope be limited to the particular embodiments set forth, as the embodiments herein are intended in all respects to be illustrative rather than restrictive.
[0086] Unless otherwise expressly stated, it is in no way intended that any protocol, procedure, process, or method set forth herein be construed as requiring that its acts or steps be performed in a specific order. Accordingly, in the subject specification, where description of a process or method does not actually recite an order to be followed by its acts or steps, or it is not otherwise specifically recited in the claims or descriptions of the subject disclosure that the steps are to be limited to a specific order, it is in no way intended that an order be inferred, in any respect. This holds for any possible non-express basis for interpretation, including: matters of logic with respect to arrangement of steps or operational flow; plain meaning derived from grammatical organization or punctuation; the number or type of embodiments described in the specification or annexed drawings; or the like.
[0087] It will be apparent to those skilled in the art that various modifications and variations can be made in the subject disclosure without departing from the scope or spirit of the subject disclosure. Other embodiments of the subject disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the subject disclosure as disclosed herein. It is intended that the specification and examples be considered as non-limiting illustrations only, with a true scope and spirit of the subject disclosure being indicated by the following claims.
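As a concrete illustration of the detection approach described above (and recited in the claims below), the following Python sketch flags surface locations whose backscattered signal strength exceeds a threshold and associates each flagged feature with a geographic location. The data layout, the sample coordinates, and the threshold value are assumptions for illustration only, not part of the disclosed system.

```python
def detect_features(radiation_data, threshold):
    """radiation_data: list of (lat, lon, signal_strength) samples,
    where signal_strength indicates the amount of radiation backscattered
    from that location on the surface.
    Returns candidate features with their associated geographic location."""
    features = []
    for lat, lon, strength in radiation_data:
        # A signal strength value above the threshold indicates a feature
        # (e.g., a crack) scattering more radiation back toward the sensor.
        if strength > threshold:
            features.append({"location": (lat, lon), "strength": strength})
    return features

samples = [
    (64.84, -147.72, 0.12),   # smooth pavement: low backscatter
    (64.84, -147.71, 0.87),   # candidate crack: strong backscatter
    (64.85, -147.71, 0.09),
]
features = detect_features(samples, threshold=0.5)
# Yields one candidate feature, located at (64.84, -147.71)
```

A fuller implementation in the spirit of the disclosure would additionally examine the pattern of signal strength values over time and cross-reference each candidate against the corresponding portion of the optical imaging data.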

Claims

1. A method for feature detection, comprising:
receiving radiation data based on radiation backscattered from a surface;
identifying, in the radiation data, a feature of the surface;
receiving optical imaging data associated with the surface;
identifying a portion of the optical imaging data comprising the feature; and
associating a geographic location with the feature.
2. The method of claim 1, further comprising providing radiation to the surface, at least a portion of the radiation yielding the radiation backscattered from the surface in response to interacting with the surface.
3. The method of claim 1, wherein the surface comprises a surface of at least one of a road, a parking lot, a driveway, a rooftop, an aircraft runway, a bridge, and a sidewalk.
4. The method of claim 1, wherein the feature comprises at least one of a crack, a hole, and an indentation in the surface.
5. The method of claim 1, wherein receiving the optical imaging data comprises receiving data from a light detection and ranging device, and wherein data from the light detection and ranging device comprises distance of the feature from the light detection and ranging device and spatial characteristics of the feature.
6. The method of claim 1, wherein identifying in the radiation data at least one feature of the surface comprises identifying from the radiation data a signal strength value above a threshold value, and wherein the signal strength value indicates the amount of radiation backscattered from one or more locations on the surface.
7. The method of claim 6, wherein identifying in the radiation data at least one feature of the surface comprises identifying a pattern of the signal strength values over time.
8. The method of claim 1, wherein associating the geographic location with the feature comprises receiving geographic data from at least one of a geographic positioning system device and an inertial reference system device.
9. The method of claim 1, further comprising providing a representation of the feature based on the optical imaging data and the associated geographic location.
10. The method of claim 9, wherein the representation of the feature is a three-dimensional computer generated model.
11. A system for feature detection, comprising:
a radiation detection device configured to receive radiation backscattered from a surface;
an optical detection device configured to receive optical imaging data associated with the surface; and
a computing device configured to:
receive radiation data from the radiation detection device, wherein the radiation data is based on the radiation backscattered from the surface,
identify in the radiation data a feature of the surface,
receive the optical imaging data associated with the surface from the optical detection device,
identify a portion of the optical imaging data comprising the feature, and
associate a geographic location with the feature.
12. The system of claim 11, wherein the radiation detection device is configured to provide radiation to the surface, at least a portion of the radiation yielding the radiation backscattered from the surface in response to interacting with the surface.
13. The system of claim 11, wherein the surface comprises a surface of at least one of a road, a parking lot, a driveway, a rooftop, an aircraft runway, a bridge, and a sidewalk.
14. The system of claim 11, wherein the feature comprises at least one of a crack, a hole, and an indentation in the surface.
15. The system of claim 11, wherein the optical detection device comprises a light detection and ranging device configured to detect distance of the feature from the light detection and ranging device and spatial characteristics of the feature, and wherein the computing device being configured to receive the optical imaging data comprises the computing device being configured to receive distance of the feature from the light detection and ranging device and spatial characteristics of the feature from the light detection and ranging device.
16. The system of claim 11, wherein the computing device being configured to identify in the radiation data at least one feature of the surface comprises the computing device being configured to identify from the radiation data a signal strength value above a threshold value, and wherein the signal strength value indicates the amount of radiation backscattered from one or more locations on the surface.
17. The system of claim 16, wherein the computing device being configured to identify in the radiation data at least one feature of the surface comprises the computing device being configured to identify a pattern of the signal strength values over time.
18. The system of claim 11, further comprising at least one of a geographic positioning system device and an inertial reference system device, wherein the computing device being configured to associate the geographic location with the feature comprises the computing device being configured to receive geographic data from at least one of the geographic positioning system device and the inertial reference system device.
19. The system of claim 11, wherein the computing device is further configured to provide a representation of the feature based on the optical imaging data and the associated geographic location.
20. The system of claim 19, wherein the representation of the feature is a three-dimensional computer generated model.
PCT/US2013/022556 2012-01-20 2013-01-22 Surface feature detection by radiation analysis WO2013110072A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261589247P 2012-01-20 2012-01-20
US61/589,247 2012-01-20

Publications (1)

Publication Number Publication Date
WO2013110072A1 true WO2013110072A1 (en) 2013-07-25

Family

ID=48799750

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/022556 WO2013110072A1 (en) 2012-01-20 2013-01-22 Surface feature detection by radiation analysis

Country Status (1)

Country Link
WO (1) WO2013110072A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015215268A (en) * 2014-05-12 2015-12-03 清水建設株式会社 Method and device for simply and quantitatively evaluating quality of concrete surface layer
US9802599B2 (en) 2016-03-08 2017-10-31 Ford Global Technologies, Llc Vehicle lane placement
US10115165B2 (en) 2012-08-22 2018-10-30 University Of Alaska Fairbanks Management of tax information based on topographical information
WO2020061289A1 (en) * 2018-09-20 2020-03-26 Nucor Corporation Online monitoring and control to eliminate surface defects arising during the production of cast steel strip
CN114821333A (en) * 2022-05-16 2022-07-29 中国人民解放军61540部队 High-resolution remote sensing image road material identification method and device
US11433903B2 (en) * 2019-05-20 2022-09-06 Universidad Carlos Iii De Madrid Road condition sensor and method for detecting the state condition of the roadway

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7454047B2 (en) * 2004-04-20 2008-11-18 Bio-Rad Laboratories, Inc. Imaging method and apparatus
US20100085175A1 (en) * 2003-03-14 2010-04-08 Liwas Aps Device for detection of surface condition data
WO2011116375A1 (en) * 2010-03-19 2011-09-22 Northeastern University Roaming mobile sensor platform for collecting geo-referenced data and creating thematic maps
US20130046471A1 (en) * 2011-08-18 2013-02-21 Harris Corporation Systems and methods for detecting cracks in terrain surfaces using mobile lidar data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13738136; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 13738136; Country of ref document: EP; Kind code of ref document: A1)