CN102472664A - Multi-spectral imaging - Google Patents

Multi-spectral imaging

Info

Publication number
CN102472664A
CN102472664A CN2010800355653A CN201080035565A
Authority
CN
China
Prior art keywords
light
image
multispectral
plane
hole
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2010800355653A
Other languages
Chinese (zh)
Inventor
R. Vlutters
R.T.J. Muijs
H.A.W. Schmeitz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Publication of CN102472664A publication Critical patent/CN102472664A/en
Pending legal-status Critical Current

Classifications

    • G01J3/2803 Investigating the spectrum using a photoelectric array detector
    • G01J3/02 Spectrometry; Spectrophotometry; Monochromators; Measuring colours — Details
    • G01J3/0208 Optical elements not provided otherwise — using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G01J3/0229 Optical elements not provided otherwise — using masks, aperture plates, spatial light modulators or spatial filters, e.g. reflective filters
    • H01L27/146 Imager structures
    • H04N23/16 Optical arrangements associated therewith, e.g. for beam-splitting or for colour correction

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Spectrometry And Color Measurement (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)

Abstract

A multi-spectral camera comprises a blocking element (201) having at least one hole (203) allowing light to pass through. A dispersive element (205) spreads light from the at least one hole (203) in different wavelength dependent directions and a lens (207) focuses light from the dispersive element (205) on an image plane (209). A microlens array (211) receives light from the lens (207) and an image sensor (213) receives the light from the microlens array (211) and generates a pixel value signal which comprises incident light values for the pixels of the image sensor (213). A processor then generates a multi-spectral image from the pixel value signal. The approach may allow a single instantaneous sensor measurement to provide a multi-spectral image comprising at least one spatial dimension and one spectral dimension. The multi-spectral image may be generated by post-processing of the sensor output and no physical filtering or moving parts are necessary.

Description

Multispectral imaging
Technical field
The present invention relates to multispectral imaging and, in particular but not exclusively, to the generation of a multispectral image comprising two spatial dimensions and one spectral dimension.
Background of the invention
The human eye is characterised by three types of cones which are sensitive to different parts of the visible spectrum. These cones are usually labelled L, M and S, referring to the wavelengths they sense (long, medium and short, roughly corresponding to red, green and blue). Fig. 1 shows the relative spectral sensitivity of the cones and illustrates that the cones have a rather broadband character, corresponding to an integration of the light over a wide wavelength range. It is therefore possible for two materials with different spectral signatures to appear to a human observer to have the same colour under specific lighting conditions. This phenomenon is known as metamerism. Similar to the human eye, three-primary (RGB) systems employing broad colour filters have become dominant for displays and cameras. A display relies on a suitable mixing of the primaries to generate any colour within the gamut spanned by the primaries.
It is often useful to characterise an image in terms of more detailed spectral reflectance information than the (relatively coarse) RGB colour coordinates. Furthermore, it is desirable to capture images with local spectral information, i.e. images in which the spectral characteristics of different parts of the image are characterised individually. Such imaging is known as multispectral imaging, and it is a technique that has found many practical applications, including for example:
- pollutant detection
- environmental monitoring
- grain/wood sorting
- microbial detection (fluorescence/cell counting)
- flow cytometry
- blood oxygenation quantification, etc.
For some applications it is desirable to analyse only a specific part of the visible spectrum. For instance, in photoplethysmography the human heart rate is derived from a time-series analysis of optical recordings. It is well established that, due to the spectral absorption properties of haemoglobin, the heart-rate signal is strongest for green light (e.g. 540-560 nm). Consequently, a system that specifically analyses the narrow band of interest will provide a more accurate estimate than a system with a broadband sensor which picks up more non-specific signal and noise.
It is desirable for a multispectral camera to simultaneously provide a high spatial resolution, a high spectral resolution and a high temporal resolution. However, these requirements frequently conflict, so a trade-off between the different demands is usually necessary.
One type of multispectral camera uses an approach wherein the scene/target is scanned line by line, and a dispersive element (such as a grating or prism) is used orthogonally to the line to extract the spectrum of each pixel in that line. The resulting two-dimensional data (one spatial dimension and one spectral dimension) is captured with a conventional two-dimensional sensor. The complete three-dimensional data set (two spatial dimensions and one spectral dimension) is then built up by gradually and sequentially scanning each line in the direction perpendicular to the line.
However, such cameras are relatively complex and often require mechanical movement to implement the scanning. This tends to increase complexity, increase cost, reduce reliability, increase power consumption and increase size and/or weight. The required scanning process is also often relatively slow, resulting in a relatively long time being needed to capture an image. This makes the approach unsuitable for, e.g., capturing moving images.
Another type of multispectral camera uses a variable spectral filter placed in front of an ordinary monochrome camera. By sequentially changing the filter and recording corresponding images, the complete three-dimensional data set can be acquired (i.e. each captured image corresponds to the light in the pass-band of the filter). A major drawback of this approach is that the optical efficiency tends to be rather low since much of the light is blocked by the filter. Furthermore, suitable filters such as liquid-crystal tunable filters and acousto-optical tunable filters are quite expensive and usually only pass light of a single wavelength band (band-pass). The approach also tends to share the drawbacks of the scanning multispectral camera, i.e. it is slow, relatively unreliable, etc.
A particularly important shortcoming of these types of multispectral cameras is that they sacrifice temporal resolution in order to obtain spectral resolution. This is disadvantageous when the imaged object is moving. Furthermore, the approaches usually have a very specific (fixed) spectral resolution which cannot easily be adapted to the application.
Hence, an improved multispectral camera would be advantageous. In particular, a multispectral camera allowing increased flexibility, reduced cost, reduced complexity, improved reliability, reduced size/weight, reduced power consumption, improved temporal performance/resolution and/or improved performance would be advantageous.
Summary of the invention
Accordingly, the invention seeks to preferably mitigate, alleviate or eliminate one or more of the above-mentioned disadvantages, singly or in any combination.
According to an aspect of the invention there is provided a multispectral camera comprising: a blocking element having at least one hole allowing light to pass through; a dispersive element for spreading light from the at least one hole in different wavelength-dependent directions; a lens for focusing light from the dispersive element on an image plane; a microlens array receiving light from the lens; an image sensor receiving light from the microlens array and generating a pixel value signal comprising incident light values for the pixels of the image sensor; and a processor for generating a multispectral image from the pixel value signal.
The invention may provide an improved multispectral camera. In particular, the approach may allow a multispectral image to be captured without requiring sequential operations such as scanning or a sequential changing of physical filters. The capturing of the image information allowing the generation of the multispectral image may in many embodiments be substantially instantaneous. The approach may provide an improved temporal performance, and may in particular allow a high temporal resolution while maintaining a relatively high spatial and spectral resolution. The multispectral camera may be particularly suitable for, e.g., capturing moving objects or video images.
The approach may also allow a fixed mechanical arrangement, and may reduce cost, size/weight, power consumption and/or complexity. It may also provide increased reliability.
For example, in some embodiments a multispectral camera using the approach may instantaneously capture a spectrum for each pixel in the scene. In contrast to conventional cameras (such as line-scanning spectrometers or variable band-pass filter cameras), the local spectral information can be captured simultaneously for all pixels, thereby providing an improved temporal performance which is very useful, e.g., in the presence of motion.
The system may use data from a conventional image sensor combined with a dedicated configuration of a main lens, a microlens array and a dispersive element (such as a grating or prism) in order to generate a detailed multispectral image. The approach may post-process the signal from the image sensor in order to generate a multispectral image meeting the desired requirements. For example, the approach may allow colour filters to be designed and applied in software as a numerical post-processing step, thereby providing increased flexibility.
A multispectral image has both spatial and spectral content. Typically, the data is represented in a three-dimensional data set corresponding to two spatial dimensions and one spectral dimension. For example, the multispectral image may represent a spectral distribution for each of a plurality of regions of the image. The multispectral image is thus a combined spatial and spectral image. In some embodiments, the multispectral image may be divided into a plurality of pixels, with a spectral distribution provided for each pixel. The multispectral image comprises separate spectral data for a plurality of regions of the image. Hence, the multispectral image comprises localised spectral data and may in particular provide information both on the visual image and on the spectral variations across the image.
The blocking element may form an enclosure for the camera such that the only light reaching the dispersive element, the lens, the microlenses and the image sensor is the light that has passed through the hole in the blocking element.
The multispectral camera may use a sequential planar structure, wherein the blocking element forms a first light blocking plane (apart from the hole), followed by a (possibly parallel) plane corresponding to the dispersive element, followed by the plane of the lens, followed by a (possibly parallel) plane of the microlens array, followed by the image sensor (which may be parallel to the microlens array). The image plane may be a (possibly parallel) virtual plane, typically located between the lens and the microlens array. The lens, microlens and sensor planes may in particular be arranged in a Scheimpflug configuration. This may, for example, allow embodiments wherein the dispersive element introduces an angle with respect to the optical axis.
The dispersive element may, for example, be a prism or a grating element, and may provide the dispersion by diffraction.
The at least one hole may be any suitable opening in the blocking element which allows light to pass through the blocking element. The hole need not be empty but may, for example, be filled with a transparent material. In some embodiments, a lens and/or a diaphragm may be located at the at least one hole. Thus, from an optical point of view, the at least one hole may be at least one lens and/or diaphragm. In particular, the at least one hole may comprise the aperture of an objective lens that is imaged onto the dispersive element.
According to an optional feature of the invention, the multispectral image comprises a spectral distribution indication for each pixel of the multispectral image.
The invention may allow an improved generation of a multispectral image with a relatively high spatial, spectral and/or temporal resolution.
According to an optional feature of the invention, the processor is arranged to: synthesise a first image of a rainbow plane from the pixel value signal; generate a second image by applying a spatial mask to the first image, the spatial mask corresponding to a spectral characteristic; and generate, from the second image, a spatial image for the multispectral image corresponding to the spectral characteristic.
This may provide improved performance and/or facilitate operation. In particular, it may allow spectral characteristics to be determined which meet the specific preferences and requirements of the individual embodiment. The spectral characteristics may, for example, be determined and changed computationally, without requiring any manual, mechanical or physical change. Thus, the characteristics of the generated multispectral image can be changed simply by adapting the processing of the sensor output. This provides a more flexible approach which is not limited by the physical restrictions of, e.g., spectral or spatial filters.
The rainbow plane is specifically a plane wherein the position of a light ray depends only on the wavelength of the light. Thus, the light from all parts of the scene (i.e. light passing through the at least one hole at any angle) will converge at the same wavelength-dependent point. Hence, at the rainbow plane, the spectral characteristics are completely converted into spatial characteristics.
The first image is not a spatial image of the scene but a spectral image wherein each point corresponds to the accumulated light intensity for one wavelength of the scene. The first image may thus be seen as a spectral image or spectral map. Specifically, the first image may be seen as a spectral intensity map.
According to an optional feature of the invention, the processor is arranged to: determine a plurality of spatial images corresponding to different spectral characteristics by applying corresponding different spatial masks to the first image; and generate the multispectral image from the plurality of spatial images.
This may provide a practical and flexible way of generating the multispectral image. A series of masks corresponding to different spectral characteristics/filters may be applied in parallel or sequentially by the post-processing algorithm in order to provide a set of spatial images corresponding to different spectral footprints/characteristics. The spectral footprints/characteristics can be selected and applied flexibly by low-complexity spatial processing.
According to an optional feature of the invention, the spectral characteristic corresponds to a band-pass filter characteristic.
The band-pass filter characteristic may in particular correspond to a selection of a frequency interval. This may allow a practical and efficient generation of the multispectral image.
According to an optional feature of the invention, the microlens array and the image sensor are arranged such that light passing through the at least one hole at the same angle is distributed over a plurality of pixels of the image sensor, the distribution being wavelength dependent.
This may improve and/or facilitate the generation of the multispectral image. The distribution may in particular be such that a single ray passing through the at least one hole at a given angle can reach a plurality of pixels, with a different wavelength interval reaching each pixel.
According to an optional feature of the invention, the processor is arranged to compensate individual pixel values of the image sensor for receiving light which corresponds to light of different wavelengths passing through the at least one hole at different angles.
This may improve and/or facilitate the generation of the multispectral image. Alternatively or additionally, it may facilitate implementation.
According to an optional feature of the invention, the microlens array is located substantially at the image plane.
This may be particularly advantageous in some embodiments or scenarios. In particular, it may in many scenarios allow the multispectral image to be generated directly from the sensor output without filtering post-processing. The approach may, for example, provide an increased spectral resolution.
According to an optional feature of the invention, the microlens array is located between the image plane and the image sensor.
This may be particularly advantageous in some embodiments or scenarios. In particular, it may allow the image sensor to capture information which is highly suitable for generating the multispectral image by computational post-processing.
According to an optional feature of the invention, the multispectral camera further comprises a user input and a controller for adjusting the position of at least one of the microlens array and the image sensor in response to the user input.
This may allow a more flexible multispectral camera and may in particular allow a user to control the trade-off between the spatial and the spectral resolution of the image captured by the image sensor.
According to an optional feature of the invention, the blocking element provides a light blocking plane and the at least one hole is a slit in the light blocking plane.
The approach may allow a three-dimensional image having two spatial dimensions and one spectral dimension to be captured from a single sensor measurement. The slit may typically have a width of 1 mm or less. Furthermore, a narrow slit may ensure that the angle of the incident rays at the at least one hole is well controlled in one dimension while allowing an extended scene to be captured. The dispersive element may, for example, be a line grating element with lines substantially parallel to the slit. The microlens array may specifically be a lenticular lens array with lens rows substantially parallel to the slit.
According to an optional feature of the invention, the image sensor is a two-dimensional image sensor.
This may allow a three-dimensional image having two spatial dimensions and one spectral dimension to be captured from a single sensor measurement.
According to an optional feature of the invention, the at least one hole comprises a plurality of holes forming a coded aperture.
This may improve the light sensitivity of the multispectral camera while still allowing the multispectral image to be generated efficiently by post-processing. The coded aperture may specifically provide incident light from a plurality of holes while allowing this to be compensated for by post-processing. The coded aperture may, for example, comprise substantially circular holes or elongated slits arranged in a suitable configuration which can be compensated/inverted by post-processing.
According to an optional feature of the invention, the at least one hole comprises a pinhole.
This may allow the angle of the incident rays at the at least one hole to be well controlled, with a clearly defined angle depending on the direction towards the light source. The pinhole may advantageously have a largest dimension of 1 mm or less.
According to an aspect of the invention there is provided a method of generating a multispectral image, the method comprising: providing a blocking element having at least one hole allowing light to pass through; providing a dispersive element for spreading light from the at least one hole in different wavelength-dependent directions; providing a lens for focusing light from the dispersive element on an image plane; providing a microlens array receiving light from the lens; providing an image sensor receiving light from the microlens array and generating a pixel value signal comprising incident light values for the pixels of the image sensor; and generating the multispectral image from the pixel value signal.
These and other aspects, features and advantages of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Brief description of the drawings
Embodiments of the invention will be described, by way of example only, with reference to the drawings, in which:
Fig. 1 is an illustration of the relative spectral sensitivity of the human eye;
Fig. 2 is an illustration of elements of a multispectral camera in accordance with some embodiments of the invention;
Fig. 3 is an illustration of elements of a multispectral camera in accordance with some embodiments of the invention;
Fig. 4 is an illustration of elements of a prior-art agile spectrum imaging system;
Fig. 5 is an illustration of elements of a processing element for a multispectral camera in accordance with some embodiments of the invention;
Fig. 6 is an illustration of elements of a multispectral camera in accordance with some embodiments of the invention; and
Fig. 7 is an illustration of elements of a multispectral camera in accordance with some embodiments of the invention.
Detailed description of some embodiments of the invention
Fig. 2 shows an example of elements of a multispectral camera in accordance with some embodiments of the invention.
The multispectral camera comprises a blocking element 201 which comprises one or more holes 203 allowing light to pass through. For clarity, the following description will focus on an example wherein a single pinhole (or narrow slit) is provided in the blocking element 201, but it will be appreciated that in other embodiments more than one hole may be included.
In the example, the hole 203 has a maximum dimension of less than 1 mm (or, for a slit, a width of less than 1 mm). Specifically, the hole 203 is so small that the variation in direction/angle at the hole of the light from each imaged object does not exceed, e.g., 1°; that is, light originating from the same position can only pass through the hole if its angles relative to the hole 203 are within 1° of each other. In the specific example, the multispectral camera is used to image objects at a distance of at least 20 cm from the hole 203, and the hole is sufficiently small that the angle/direction of the light from the same spatial point passing through the hole 203 does not vary by more than 1°.
The multispectral camera also comprises a dispersive element 205 which receives the light passing through the hole 203. In the example, the dispersive element 205 forms a dispersion plane. The dispersive element 205 spreads the light from the hole 203 in different wavelength-dependent directions. Thus, light may originate from an object in the scene and pass through the hole 203 to reach the dispersive element 205. Due to the small size of the hole 203, the direction/angle at which the light reaches the hole depends only on the direction from the object to the hole 203 (assuming the hole 203 has an infinitesimal size). The dispersive element 205 then spreads this light over an angular distribution, with the exit angle from the dispersive element 205 depending on the wavelength.
It should be noted that the size of the hole directly determines the obtainable spectral resolution. The angular range of the light impinging on the same position of the dispersive element 205 is given by the pinhole size divided by the distance between the hole 203 and the dispersive element 205. This controls the direction of the different rays after dispersion, and thus the spectral resolution obtainable, e.g., at the rainbow plane.
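As a small numerical illustration of this relation, the sketch below estimates the spectral blur caused by a finite pinhole. All values (pinhole size, distance, grating period, normal incidence) are assumptions chosen for the example and are not taken from the patent.
```python
import math

pinhole_size_mm = 0.5          # assumed pinhole diameter (below the 1 mm mentioned above)
pinhole_to_grating_mm = 50.0   # assumed distance, within the 10-100 mm range mentioned below
grating_period_um = 2.0        # assumed line grating period (500 lines/mm)
diffraction_order = 1

# Angular spread of the rays from one scene point hitting one spot on the grating.
delta_theta_rad = pinhole_size_mm / pinhole_to_grating_mm   # small-angle approximation

# Grating equation (normal incidence assumed): sin(theta_out) = m * lambda / d,
# so d(theta_out)/d(lambda) = m / (d * cos(theta_out)).
wavelength_nm = 550.0
d_nm = grating_period_um * 1e3
theta_out = math.asin(diffraction_order * wavelength_nm / d_nm)
dtheta_dlambda = diffraction_order / (d_nm * math.cos(theta_out))   # rad per nm

# Wavelength span that ends up travelling in the same outgoing direction (spectral blur).
delta_lambda_nm = delta_theta_rad / dtheta_dlambda
print("angular spread: %.2f deg" % math.degrees(delta_theta_rad))
print("approximate spectral blur: %.1f nm" % delta_lambda_nm)
```
With these assumed numbers the angular spread is roughly 0.6° and the spectral blur roughly 20 nm, showing how a smaller hole or a larger hole-to-grating distance improves the spectral resolution at the cost of light throughput.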
In some embodiments, the dispersive element 205 may, for example, be a prism. In the example of Fig. 2, the dispersive element 205 is a grating element which spreads the incident light due to the diffraction effect. Thus, in the example, the light from the scene (objects X, Y, Z) propagates through the hole 203 and subsequently falls on the grating (which provides the dispersive element 205). Due to the diffraction at the grating, the different wavelengths are spread in different propagation directions.
Depending on the desired field of view, the distance between the hole 203 and the dispersive element 205 may typically and advantageously be in the interval from 10 to 100 mm (both values included).
The multispectral camera also comprises a lens 207 which receives the light from the dispersive element 205 and focuses it on an image plane 209. The focusing is such that all rays passing through the hole 203 at a given angle reach the same point of the image plane 209. Thus, when measured at the image plane 209, the lens supplements/reverses the operation/effect of the dispersive element 205. Hence, the spreading of the light by the dispersive element 205 is exactly compensated towards the image plane 209 by the lens 207, such that a single ray converges to a single point on the image plane 209. Therefore, apart from an inversion (i.e. the image being "upside down"), the incident light on the image plane 209 corresponds to the incident light on the dispersive element 205.
It should be noted that the image plane 209 is not a physical element but refers to the plane wherein the spectral spreading of the light is compensated. Thus, if an image sensor were located at the image plane, it would capture a spatial image without capturing any spectral information. The image plane 209 can be seen as a virtual plane in which a focused spatial image can be captured.
The lens 207 is typically oriented with its main axis/plane perpendicular to the Nth (typically first) order diffraction of the dispersive element 205. Furthermore, the distance between the dispersive element 205 and the lens 207 will typically and advantageously be larger than the distance between the hole 203 and the dispersive element 205.
The multispectral camera also comprises a microlens array 211 which receives the light from the lens 207. The microlens array 211 comprises a plurality of lenses covering the plane of incident light from the lens 207. The microlens array 211 may advantageously form a plane which intersects the "hinge line" (Scheimpflug configuration) which also intersects the plane of the dispersive element 205 and the main axis/plane of the lens 207. The distance between the lens 207 and the microlens array 211 can be determined from the lens formula for the lens 207, and thus depends on the strength of the lens 207 and the distance to the dispersive element 205.
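As a rough illustration of that last statement, the following sketch evaluates the thin-lens formula with assumed values for the focal length and the dispersive-element-to-lens distance; neither value is specified in the text.
```python
# All values below are assumptions for illustration only.
focal_length_mm = 25.0           # assumed focal length of the lens 207
dispersive_to_lens_mm = 60.0     # assumed distance, larger than the hole-to-grating distance

# Thin-lens formula: 1/f = 1/d_object + 1/d_image, with the dispersive element as "object".
lens_to_image_mm = 1.0 / (1.0 / focal_length_mm - 1.0 / dispersive_to_lens_mm)
print("lens 207 to microlens array distance of about %.1f mm" % lens_to_image_mm)
```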
The multispectral camera also comprises an image sensor 213, which comprises a plurality of sensor elements for detecting the incident light level. Thus, each sensor element is a light sensor corresponding to a pixel of the captured image. In the example of Fig. 2, the sensor elements are arranged in a two-dimensional plane.
Thus, each sensor element may correspond to a pixel of the image generated by the microlens array 211 at the plane in which the image sensor 213 is located. The image sensor 213 generates a pixel value signal which comprises incident light values for the pixels of the image sensor. The pixel value signal may in particular comprise a measurement value for each sensor element.
The pixel value signal is fed to a processor which then determines the multispectral image from the signal.
The introduction of the microlens array 211 provides information which, in the specific camera configuration, can be used to post-process the captured information such that a three-dimensional (two spatial dimensions and one spectral dimension) data set corresponding to a multispectral image can be generated from a single instantaneous measurement. Thus, the spectral information can be determined without requiring sequential physical optical filtering or scanning.
In particular, the introduction of the microlens array 211 allows spatial and spectral characteristics to be measured accurately and determined individually based on a single two-dimensional image sensor.
An example of this is illustrated in Fig. 3. In this scenario, the microlens array 211 is located substantially at the image plane 209. Thus, all light incident on the microlens array 211 is spatially well focused, and each microlens can be seen as corresponding to a spatial pixel value. The image projected onto the microlens array 211 therefore has no spectral distribution, i.e. all wavelengths from the same position in the scene (and thus passing through the hole 203 at the same angle) reach the same microlens. However, although the different wavelengths of the light converge to the same point on the microlens array 211, the wavelengths converge from different directions and have different angles of incidence. This is exploited by the microlens array 211 to spread the incident rays according to the incoming angle, and thus according to the wavelength. Hence, the light exiting a microlens corresponds to the light incident on that microlens (and thus to a single position), but is spread over angles reflecting the wavelengths, i.e. the exiting light has a spectral (spatial) distribution.
In the example of Fig. 3, the image sensor 213 is positioned such that the light from one microlens covers a plurality of pixels while, at the same time, the light from each microlens only reaches one group of pixels (or, conversely, there is no overlap between the light cones from the individual microlenses, and each pixel (sensor element) receives light from only one microlens).
Accordingly, in the example of Fig. 3, the image sensor 213 captures light which is divided into spatial pixel groups, with each pixel group corresponding to one microlens. Furthermore, each pixel group comprises a plurality of pixels, with each pixel corresponding to light of a specific wavelength interval. Thus, the captured data provides multispectral data corresponding to a multispectral image with a spatial resolution corresponding to the number of microlenses of the microlens array 211 and a spectral resolution corresponding to the number of pixels in each pixel group.
As a practical example, a 100-by-100 microlens array 211 may be used with a 1-megapixel sensor to provide a multispectral image with a spatial resolution of 100 by 100 pixels and a spectral resolution of 100 spectral values for each spatial pixel.
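For illustration, the sketch below reorganises such a raw sensor readout into a spatial-spectral cube. It assumes a 1000x1000 pixel sensor, a 10x10 pixel group per microlens, and a simple flattening of each group into 100 spectral samples; the actual mapping from pixel position to wavelength depends on the optics and is not specified here.
```python
import numpy as np

sensor = np.random.rand(1000, 1000)          # stand-in for the pixel value signal
n_lens = 100                                  # microlenses per dimension
group = sensor.shape[0] // n_lens             # pixels per microlens per dimension (10)

cube = (sensor
        .reshape(n_lens, group, n_lens, group)    # split into per-microlens blocks
        .transpose(0, 2, 1, 3)                    # -> (lens_y, lens_x, group_y, group_x)
        .reshape(n_lens, n_lens, group * group))  # flatten each group into 100 spectral samples

print(cube.shape)   # (100, 100, 100): 100x100 spatial pixels, 100 spectral values each
```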
Thus, the microlens array 211 provides spectral information by exploiting the wavelength-dependent angle of the rays incident on the microlens array 211, while maintaining the spatial differentiation of the image plane. Hence, spectral and spatial information which can be separated is captured simultaneously by a single sensor measurement.
It should be noted that such a measurement cannot be performed without introducing the microlens array 211. For instance, in the article by Mohan, A., Raskar, R. and Tumblin, J., "Agile Spectrum Imaging: Programmable Wavelength Modulation for Cameras and Projectors", Computer Graphics Forum, Vol. 27, No. 2, 2008, pp. 709-717, a multispectral camera based on, e.g., the structure of Fig. 4 is proposed. In that system, the image sensor is located at the image plane, and only at this position in that system does the position of a light ray depend only on the spatial position of the light source. Thus, in the prior-art system, this plane is the only plane that allows the spatial position to be determined.
The prior-art camera performs optical filtering of the incoming light in order to generate the multispectral image. Specifically, the prior-art camera sequentially inserts different filters at the rainbow plane. The rainbow plane is a plane wherein the position of a light ray depends only on the wavelength of the light and not on the spatial origin of the light (and thus not on the angle/direction at which the light passed through the hole 203). Hence, at the rainbow plane, the light is sorted from top to bottom according to its specific wavelength. The rainbow plane can be seen as complementary to the image plane, i.e. the image at the rainbow plane is a spectral image without spatial information, whereas the image at the image plane is a spatial image without spectral information.
This is exploited in the prior-art camera by locating the image sensor at the image plane and a filter element at the rainbow plane. Specifically, a series of blocking or attenuating masks is sequentially inserted at the rainbow plane, and a spatial image is captured by the image sensor for each mask. Each of these images thus corresponds to a spectral footprint which in turn corresponds to the mask, and by applying a series of masks the images can be combined to provide a multispectral image.
However, such an approach tends not to be optimal and may be considered slow, complex and impractical for some applications. In particular, the requirement to physically change the masks at the rainbow plane is inconvenient and typically results in a relatively slow operation and a low temporal resolution.
It should be noted that, in the system of Fig. 4, an image captured at any other plane would be a combination of spatial and spectral information. Thus, the light reaching a specific point of a given plane (other than the rainbow plane or the image plane) corresponds to light of different wavelengths originating from different directions. This combined information cannot be resolved by processing the image data captured by the image sensor, and the camera therefore requires the image sensor to be located at the image plane and the blocking masks to be sequentially introduced at the rainbow plane.
However, the multispectral camera of Figs. 2 and 3 uses a completely different approach which does not require any masking filters to be introduced at the rainbow plane, and which allows flexibility in the positioning of the image sensor 213.
Specifically, the system uses the microlens array 211 to provide information which allows spectral and spatial information to be extracted simultaneously from a single plane and from a single measurement of a single image sensor. In particular, the approach is based on the insight that, in any plane, additional information is provided by the angle of incidence of the light on that plane, and that this information can actively be used to separate the spatial and spectral characteristics at that plane.
For instance, at the image plane, the position of the incoming light depends only on the originating position in the scene (and specifically on the angle at which it passes through the pinhole 203). However, due to the dispersion provided by the dispersive element 205, the angle at which the light falls on a specific point depends on the wavelength. Therefore, in Fig. 3, the microlens array 211 inserted at the image plane uses this angle dependency to generate a spectral distribution for each microlens. The spectral distribution for each microlens accordingly reflects the spectral characteristics of exactly the image area corresponding to that microlens, and does not include any contribution from any other position. The microlens array 211 thus ensures that spectral and spatial information is kept separate and can be measured accordingly by the sensor. Indeed, in this example, the sensor is prevented from receiving a combination of light with different wavelengths from different positions, thereby avoiding an ambiguity that could not be resolved.
Similarly, at the rainbow plane, the position of each ray depends only on the wavelength, but the direction/angle depends on the spatial position in the scene. Thus, a microlens array 211 at the rainbow plane could generate a spatial distribution for each microlens, i.e. the distribution for each microlens would correspond to the spatial image for the wavelength interval spanned by that microlens (although this is typically less practical and more difficult to process than the example of Fig. 3).
In some embodiments, the microlens array 211 may be located at a plane in which each position is reached by different rays which originate from different positions but have different wavelengths, the wavelength difference exactly compensating the position difference. For instance, in many embodiments the microlens array 211 may advantageously be located behind the image plane (on the side away from the lens 207). In this case, each sensor element of the image sensor 213 may receive a combination of light having different wavelengths and coming from different spatial positions. However, the additional information provided by the plurality of light sensors for each microlens allows this light combination to be resolved, thereby allowing a suitable multispectral image to be generated. In other words, the information residing in the angle of incidence of the light is further exploited, with the microlens array 211 resolving the inherent positional ambiguity.
In some embodiments, the signal from the image sensor 213 may be post-processed to provide a multispectral image with a high spatial resolution and a high and flexible spectral resolution. The post-processing may in particular comprise synthesising an image corresponding to the image that would be captured at the rainbow plane. A spatial filtering (typically a masking or attenuation of areas) is then applied to this rainbow image. Based on the resulting image, a spatial image is then synthesised and used as the image for the spectral footprint corresponding to the filtering applied to the rainbow plane image. By applying a set of filters/masks, images corresponding to different spectral footprints or characteristics can be generated, and the multispectral image can be determined as the three-dimensional (two spatial dimensions and one spectral dimension) image set comprising these two-dimensional spatial images for the different spectral footprints.
The processor may specifically iterate the following steps (a simplified sketch of this pipeline is given after the list):
1. Synthesise the data that would be recorded by a virtual sensor located in the rainbow plane.
2. Apply the desired numerical aperture/filter to the synthesised data.
3. Re-synthesise the filtered light data at the physical sensor plane.
4. Form the spatial image by combining (integrating) the rays arriving at each pixel location.
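The following is a minimal, illustrative sketch of these four steps, not the patent's implementation: the ray-tracing geometry is abstracted into two precomputed index mappings, steps 1 and 2 are folded into a per-pixel mask lookup, and all function and variable names are hypothetical.
```python
import numpy as np

def synthesize_multispectral(sensor_pixels, rainbow_index, output_index,
                             spectral_masks, out_shape):
    """sensor_pixels: 1-D array with the measured pixel value signal.
    rainbow_index:    for each sensor pixel, the index of the rainbow-plane position its
                      ray maps to (obtained by ray tracing; here assumed precomputed).
    output_index:     for each sensor pixel, the index of the output spatial pixel bin
                      its ray belongs to (also assumed precomputed by ray tracing).
    spectral_masks:   list of 1-D attenuation masks defined over the rainbow plane.
    out_shape:        (height, width) of each output spatial image."""
    images = []
    for mask in spectral_masks:
        # Steps 1-2 folded together: filtering the synthetic rainbow-plane data is
        # equivalent to scaling each sensor pixel by the mask value at its rainbow position.
        filtered = sensor_pixels * mask[rainbow_index]
        # Steps 3-4: accumulate (integrate) the filtered rays into their output pixel bins.
        image = np.zeros(out_shape[0] * out_shape[1])
        np.add.at(image, output_index, filtered)
        images.append(image.reshape(out_shape))
    # Stack the per-band spatial images into the multispectral cube (y, x, band).
    return np.stack(images, axis=-1)
```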
The synthesis for the different filters may be performed by ray tracing. Indeed, the processing may be similar to the processing proposed for plenoptic cameras when performing refocusing after capture, as explained in Lumsdaine, A. and Georgiev, T., "The Focused Plenoptic Camera", International Conference on Computational Photography, April 2009.
Fig. 5 shows an example of a processor following this approach. The processor comprises a rainbow plane processor 501 which receives the pixel value signal from the image sensor. The pixel value signal comprises the light level for each light sensor, i.e. for each pixel of the image sensor.
The rainbow plane processor 501 proceeds to synthesise a first image corresponding to the image that would be received at the rainbow plane by a virtual image sensor. This synthesis may be performed by a ray-tracing algorithm which uses the spatial and angular information of the light incident on the image sensor 213 to calculate the image at the rainbow plane. The synthesis may take into account that light of different wavelengths reaches a light sensor from different angles, and may compensate for this accordingly.
The first image is not a spatial image of the scene but a spectral image wherein each point corresponds to the accumulated light intensity for one wavelength of the scene. The first image may thus be seen as a spectral image or spectral map. Specifically, the first image may be seen as a spectral intensity map.
More specifically, the rainbow plane processor 501 may synthesise the first image at the rainbow plane by tracing the rays impinging on each sensor pixel back to the rainbow plane 215, taking the refraction at the microlens plane into account.
The ray trace may be realised by looking from the pixel coordinate towards the centre of the corresponding microlens, which provides the position and the angle of the ray. Next, the ray can be traced through the lens towards the rainbow plane according to the matrix formulation of geometrical optics (see, for example, E. Hecht, "Optics" (ISBN 0321188780), section 6.2 "Analytical Ray Tracing" and in particular section 6.2.1 "Matrix Methods"). Then, based on the position of the ray at the rainbow plane, the corresponding sensor pixel can be processed.
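A minimal sketch of such matrix ("ABCD") ray tracing is given below; the paraxial geometry, distances and focal length are assumptions for illustration only, whereas in the actual camera they follow from the specific configuration.
```python
import numpy as np

def translation(d_mm):
    """Free-space propagation over distance d."""
    return np.array([[1.0, d_mm],
                     [0.0, 1.0]])

def thin_lens(f_mm):
    """Refraction by a thin lens with focal length f."""
    return np.array([[1.0, 0.0],
                     [-1.0 / f_mm, 1.0]])

# Ray leaving one sensor pixel towards the centre of its microlens (assumed values):
y0_mm = 0.2       # lateral offset of the pixel relative to its microlens
u0_rad = -0.02    # angle given by the pixel-to-microlens-centre geometry
ray = np.array([y0_mm, u0_rad])

# Trace from the microlens plane, through the main lens 207, to the rainbow plane.
# All distances below are placeholders for illustration only.
microlens_to_lens_mm = 40.0
lens_to_rainbow_mm = 60.0
focal_length_mm = 25.0

system = (translation(lens_to_rainbow_mm)
          @ thin_lens(focal_length_mm)
          @ translation(microlens_to_lens_mm))
y_rainbow, u_rainbow = system @ ray
print("position on rainbow plane: %.3f mm (maps to one wavelength bin)" % y_rainbow)
```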
The synthesised image is then fed to a spatial mask processor 503 which is arranged to apply a spatial mask to the first image. The mask may, for example, be a binary mask or a continuous mask comprising, e.g., an attenuation value for each pixel of the first image. For instance, a predetermined mask comprising a scale factor for each pixel may be applied to the first image by multiplying the pixel values by the scale factors.
Since the rainbow plane comprises an image wherein each position corresponds to a specific wavelength (independently of the spatial characteristics), the application of the spatial mask corresponds to a filtering in the spectral/frequency domain. Hence, any desired spectral footprint of the resulting signal can easily be generated by a low-complexity application of a mask. The approach can be used to provide a low-complexity band-pass filtering. For instance, for a given frequency/wavelength interval, a suitable mask can be determined simply by setting the scale factors for the pixels corresponding to that interval to one and all other scale factors to zero (it will be appreciated that in most embodiments the transition will be smoothed by applying a suitable window, such as a Hanning or Hamming window).
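By way of illustration, the following sketch constructs such a smoothed band-pass mask, assuming for simplicity that the wavelength varies linearly along one axis of the rainbow-plane image; the true position-to-wavelength mapping would follow from the ray tracing for the specific camera geometry.
```python
import numpy as np

def bandpass_mask(wavelengths_nm, band_nm, smooth=True):
    """Return a mask over the rainbow-plane positions passing only band_nm = (low, high)."""
    low, high = band_nm
    mask = ((wavelengths_nm >= low) & (wavelengths_nm <= high)).astype(float)
    if smooth and mask.sum() > 2:
        # Soften the hard edges with a Hanning window over the pass-band, as suggested above.
        idx = np.nonzero(mask)[0]
        mask[idx] = np.hanning(len(idx))
    return mask

wavelengths = np.linspace(400.0, 700.0, 512)             # assumed wavelength per position
green_mask = bandpass_mask(wavelengths, (540.0, 560.0))   # e.g. the green band mentioned for PPG
```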
Thus, the spatial mask processor 503 generates a masked image corresponding to a specific spectral interval. This image is fed to an image processor 505 which proceeds to synthesise the spatial image corresponding to the masked image at the rainbow plane. The spatial image can be generated by performing ray tracing from the masked image data (i.e. considering the angles and the light intensities of the rays). The ray tracing may, for example, be used to determine the image at the image plane, which can be considered a purely spatial image generated for the specific frequency interval selected by the masking.
More specifically, the image processor 505 can synthesise the spatial image by integrating/adding the light for each spatial image position. In order to create a multi-band (e.g. RGB) image, a separate masking and image forming process can be performed for each band. As with an ordinary camera, the camera integrates the light from all different directions for each spatial position/pixel. In this case, in order to synthesise the image, all rays falling into the same output pixel bin must be integrated (taking into account that the angle and the image position in the plane 215 belonging to each sensor pixel are known).
In the example of Fig. 5, the operation of the processor is controlled by a controller 507 which is coupled to the rainbow plane processor 501, the spatial mask processor 503 and the image processor 505. When the rainbow plane processor 501 receives an image from the image sensor 213, the controller 507 controls it to synthesise the image at the rainbow plane and to forward it to the spatial mask processor 503. The controller then provides a first mask to the spatial mask processor 503 and instructs the spatial mask processor 503 and the image processor 505 to perform the required processing in order to generate the spatial image corresponding to the spectral distribution represented by this first mask. When this image is received, the controller 507 stores the image and proceeds to provide a second mask to the spatial mask processor 503. This second mask corresponds to a different spectral distribution than the first mask. The spatial mask processor 503 and the image processor 505 are then controlled to generate a second spatial image corresponding to this second spectral distribution. The process is repeated for as many masks/spectral distributions as desired for the specific application. The resulting set of two-dimensional spatial images corresponding to the different spectral distributions is then combined into an image set providing the multispectral image.
For instance, the masks provided may correspond to a spectrum divided into a desired number of intervals, and the multispectral image may accordingly have a spectral resolution corresponding to the masks and the plurality of spatial images generated.
Thus, instead of introducing physical filters in the rainbow plane, the approach allows filters to be applied in a post-processing step involving numerical operations. This allows the multispectral image to be generated based on a single instantaneous capture by the image sensor. The approach is accordingly suitable for, e.g., imaging moving objects.
The positioning of the microlens array 211 and the image sensor 213 can be used to provide the desired trade-off between different characteristics. Indeed, different trade-offs between spatial and spectral resolution can be obtained by positioning the microlens array 211 and the image sensor 213 at slightly different positions. For instance, relative to the configuration of Fig. 3, each individual light sensor/pixel in the configuration of Fig. 2 samples a slightly different segment of the image and accordingly integrates over a more or less wide wavelength range. This results in a higher spatial resolution but a lower spectral resolution than the configuration of Fig. 3 (for the same image sensor 213).
The configuration of Fig. 3 obtains the optimum spectral resolution, determined by the number of pixels/sensor elements underneath a single microlens, and also has the minimum spatial resolution, determined by the number (and size) of the microlenses and the magnification of the imaging system. Indeed, in the example of Fig. 3, each individual pixel/sensor element underneath a single microlens measures information about the same image section but for a different wavelength. In contrast, in the example of Fig. 2, each individual pixel/sensor element underneath a single microlens measures information about the full spectrum but for a different image section. For other positions of the microlens array 211, the information captured by the set of pixels/sensor elements corresponding to a given microlens comprises a combination of wavelength and position information, i.e. a combination of spectral and spatial information.
Consequently, the positioning of the microlens array 211 and the image sensor 213 is a trade-off between spectral and spatial resolution. In some embodiments, the multispectral camera may also comprise a user input which can be used to modify the position of the image sensor 213 and/or the microlens array 211 (and/or the lens 207) in accordance with a user input. The user input may, for example, be a mechanical input directly offsetting the position of one of the elements, or may, for example, be an electrical user input controlling a mechanical actuator (e.g. a stepper motor) which moves the image sensor 213 and/or the microlens array 211. The movement may, for example, be relative to the image plane or the lens 207, or may, for example, correspond to a relative movement between the microlens array 211 and the image sensor 213.
Thus, the user input can be used to adapt the multispectral camera to the specific characteristics and preferences of individual applications.
In many applications, an improved performance and/or a facilitated operation is achieved with the microlens array 211 located between the image plane 209 and the image sensor 213. Indeed, this may often provide a suitable trade-off between spectral and spatial resolution while allowing post-processing of relatively low complexity. In other applications, an improved performance and/or a facilitated operation is achieved with the microlens array 211 located in front of the image plane 209 (relative to the image sensor 213).
Said method can provide the spectrum imaging system of high flexible, wherein can in software, programme chromatic filter and does not need the chromatic filter of physics.This just provides much higher degree of freedom (for example having negative light filter coefficient) aspect filter design.Consequently might design chromatic filter with the spectral response that can't produce through the physics chromatic filter that adopts LC layer, acousto-optic element or chemical solution.
Compared with a line-scanning spectrometer, the proposed device has the additional benefit that the local spectral information for all pixels in the scene is collected instantaneously, thereby avoiding the complications that arise when motion is present. The multispectral camera provides spectral information at the cost of (some) spatial resolution, but the camera can be adapted to provide the optimal balance between the two for a specific application.
Figs. 2 and 3 provide two-dimensional representations of the multispectral camera, and in some embodiments the camera may indeed comprise a single spatial dimension and one spectral dimension. In many embodiments, however, the image sensor is a two-dimensional sensor and the camera provides two spatial dimensions and one spectral dimension. Specifically, Figs. 2 and 3 may be regarded as cross-sections of a structure that extends perpendicular to the plane of each figure (i.e. along the plane of each element, the rainbow plane and the image plane). Accordingly, the pinhole 203 may be a narrow slit, the dispersing element 205 may for example be a line grating, the main lens 207 may be an ordinary (spherical) lens, and the microlens array 211 may be a lenticular lens array.
In the preceding examples the blocking element 201 is shown with a single hole 203. However, this tends to limit the amount of light that is captured, resulting in a low light sensitivity of the camera. In order to improve the optical efficiency of the system it may be desirable to enlarge the aperture (i.e. the size of the hole). However, this would blur the spectrum, because the incident light falling on the dispersing element 205 would then cover a considerable range of incidence angles.
Therefore, in order to improve the light sensitivity, the blocking element may comprise a plurality of holes forming a coded aperture. The coded aperture can be regarded as a plurality of holes arranged in a known pattern that can be inverted. Specifically, the coded aperture may be a row of holes with a specific pattern. By choosing a pattern that is easy to invert, the amount of light entering the camera, and thus the light sensitivity, can be increased while the enlarged opening is compensated for. In such a system the desired color filters can be applied in the rainbow plane before the data is decoded/inverted, which can for example be achieved by deconvolving the data with an inverse filter: the data at the rainbow plane is transformed to the Fourier domain and the Fourier coefficients are divided by the corresponding coefficients of the (projection of the) coded aperture. An inverse Fourier transform then yields the deconvolved data.
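A minimal one-dimensional sketch of this inversion is given below (the signal model, hole positions and array sizes are illustrative assumptions): the measurement is treated as a circular convolution of the true rainbow-plane profile with the projected coded-aperture pattern, and is divided by the pattern's Fourier coefficients before transforming back.

```python
# Minimal sketch of Fourier-domain deconvolution for a 1-D coded aperture.
# A small epsilon guards against division by near-zero coefficients; a
# well-chosen (easily invertible) code keeps those coefficients away from zero.
import numpy as np

def deconvolve_coded_aperture(measured: np.ndarray, aperture: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Recover the rainbow-plane profile from a coded-aperture measurement."""
    H = np.fft.fft(aperture, n=measured.size)   # Fourier coefficients of the code
    M = np.fft.fft(measured)                    # Fourier transform of the data
    restored = np.fft.ifft(M / (H + eps))       # divide and transform back
    return np.real(restored)

# Hypothetical example: a 3-hole code blurring a two-line spectrum.
aperture = np.zeros(64)
aperture[[0, 5, 11]] = 1.0
truth = np.zeros(64)
truth[20], truth[40] = 1.0, 0.5
measured = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(aperture)))  # circular convolution
recovered = deconvolve_coded_aperture(measured, aperture)
print(np.allclose(recovered, truth, atol=1e-3))  # True for this noiseless toy case
```

In the described system a desired color filter could be applied to the rainbow-plane data before this inversion step.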
It will be appreciated that a lens and/or diaphragm (combined with a coded aperture where an invertible code is used) may be used instead of the pinhole. In such an embodiment the lens/diaphragm is designed such that the range of angles impinging on the dispersing element 205 is sufficiently small. For example, in some embodiments an 80 mm lens with an F/16 aperture (80/16 = 5 mm diameter) may be used.
It should be mentioned that, as is common for systems using a microlens array, it is advantageous to perform F-number matching so as to ensure that the full range of incidence angles is mapped onto the pixels behind a single microlens. Otherwise a pixel/sensor of the image sensor 213 may receive light that has passed through several microlenses, which may result in unrecoverable ambiguity. For spectral imaging, the F-number of the microlenses should preferably not match the F-number of the lens (focal length/diameter) but rather the spectral range (the distance from the rainbow plane to the microlenses divided by the diameter of the rainbow plane). This aims to provide a unique, non-overlapping optical path for each pixel. On the other hand, microlenses capturing a larger angular range than the available one are undesirable, since this would result in empty/black areas on the sensor. The exact configuration and specifications of the individual camera elements can be selected to optimize performance for the specific application.
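The following short sketch illustrates this bookkeeping, using the 80 mm F/16 example from the text together with hypothetical rainbow-plane dimensions (the distance and diameter values are assumptions for illustration only):

```python
# Minimal sketch of the F-number bookkeeping described above (illustrative values).
# The main lens F-number is focal length / aperture diameter, while for spectral
# imaging the microlenses should instead match the "spectral" F-number: the
# rainbow-plane-to-microlens distance divided by the rainbow-plane diameter.

def f_number(focal_length_mm: float, aperture_diameter_mm: float) -> float:
    return focal_length_mm / aperture_diameter_mm

def spectral_f_number(rainbow_to_microlens_mm: float, rainbow_diameter_mm: float) -> float:
    return rainbow_to_microlens_mm / rainbow_diameter_mm

# The 80 mm F/16 example from the text: aperture diameter 80 / 16 = 5 mm.
main_lens_f = f_number(80.0, 5.0)  # -> 16.0

# Hypothetical rainbow-plane geometry; the microlens F-number should match this
# value, which in general differs from the main lens F-number.
target_microlens_f = spectral_f_number(rainbow_to_microlens_mm=50.0, rainbow_diameter_mm=5.0)
print(main_lens_f, target_microlens_f)  # 16.0 10.0
```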
The preceding description has concentrated on embodiments in which the different planes and elements are substantially parallel. It will be appreciated, however, that this is not essential and that other configurations may be used in other embodiments. Specifically, the dispersing-element plane, the lens plane, the microlens-array plane and the sensor plane may be arranged in a Scheimpflug configuration. Examples of such embodiments are provided in Figs. 6 and 7. Fig. 6 shows an example with a flat outer focal plane and an inner Scheimpflug configuration, and Fig. 7 shows an example with outer Scheimpflug focusing and a flat inner configuration.
It will be appreciated that, although the structures of Figs. 2 and 3 consider light dispersion/propagation in a single dimension (the up/down direction in the figures), more than one dimension may be considered in other implementations. For example, in some embodiments the spectrum of the incoming light may also be dispersed in a direction perpendicular to the direction shown (that is to say, a similar dispersion may occur in the direction into and out of the drawing).
It will also be appreciated that in some embodiments a moving camera may be used to generate a plurality of multispectral images, the plurality of images then being used for subsequent analysis. For example, a plurality of multispectral images may be generated while the camera moves along an arc around the object under evaluation. The local spectral properties, and the changes therein, can subsequently be used to analyze the object. This may for example be suitable for analyzing paint chipping or other materials.
Indeed, the multispectral imaging provided by the described multispectral camera can be used in many applications.
For example, it can be used in lighting applications where very strict color rendering standards must be met. Multispectral imaging may for example be employed to detect and mimic the appearance of sunlight.
As another example, the approach can be used for (partial) display characterization, in order to detect and compensate local color non-uniformities that may appear in LED-backlit or OLED devices as a result of aging effects or thermal non-uniformity.
The approach can also be used in various identification applications, such as food quality detection, paint identification, contaminant detection and the like. It can also be used in the consumer domain, where the described approach is sufficiently compact and mechanically robust to be built into a mobile phone.
In addition, the described approach can be used in several healthcare applications. For example, the depth to which light penetrates the skin depends on its wavelength. By producing images of the skin structure as a function of wavelength, the skin can therefore be imaged over a range of depths. Furthermore, the reflectance spectrum of skin has a very distinctive signature, which can be exploited to detect humans in a scene. This can be achieved by cross-correlating the local spectrum with the expected spectral signature, thereby providing a probability map of human presence. Such spectral human detection should be considerably more reliable than common skin-tone detectors based on three broad color channels.
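A minimal sketch of this correlation step is shown below (the reference skin signature and the cube layout are illustrative assumptions, not measured data): each pixel's spectrum is compared with the expected signature by normalised correlation, giving a probability-like map.

```python
# Minimal sketch: normalised correlation of each pixel spectrum in a
# (rows, cols, bands) cube with a reference spectrum; values near 1 indicate
# a close spectral match.
import numpy as np

def spectral_similarity_map(cube: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Per-pixel normalised correlation with a reference spectrum."""
    c = cube - cube.mean(axis=-1, keepdims=True)
    r = reference - reference.mean()
    num = np.einsum("ijk,k->ij", c, r)
    den = np.linalg.norm(c, axis=-1) * np.linalg.norm(r) + 1e-12
    return num / den

# Hypothetical 16-band cube and an assumed skin-like reference signature.
cube = np.random.rand(120, 160, 16)
skin_reference = np.linspace(0.2, 0.6, 16)  # placeholder shape, not measured data
probability_map = spectral_similarity_map(cube, skin_reference)
print(probability_map.shape, probability_map.min(), probability_map.max())
```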
It will be appreciated that the above description has, for clarity, described embodiments of the invention with reference to different functional units and processors. It will be apparent, however, that any suitable distribution of functionality between different functional units or processors may be used without deviating from the invention. For example, functionality illustrated as being performed by separate processors or controllers may be performed by the same processor or controller. References to specific functional units should therefore be seen only as references to suitable means for providing the described functionality, rather than as indicative of a strict logical or physical structure or organization.
The invention can be implemented in any suitable form, including hardware, software, firmware or any combination thereof. The invention may optionally be implemented at least partly as computer software running on one or more data processors and/or digital signal processors. The elements and components of an embodiment of the invention may be physically, functionally and logically implemented in any suitable way. Indeed, the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the invention may be implemented in a single unit, or may be physically and functionally distributed between different units and processors.
Although the present invention has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present invention is limited only by the appended claims. Additionally, although a feature may appear to be described in connection with a particular embodiment, one skilled in the art will recognize that various features of the described embodiments may be combined in accordance with the invention. In the claims, the term "comprising" does not exclude the presence of other elements or steps.
Furthermore, although individually listed, a plurality of means, elements or method steps may be implemented by, for example, a single unit or processor. Additionally, although individual features may be included in different claims, these may advantageously be combined, and their inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. Also, the inclusion of a feature in one category of claims does not imply a limitation to this category, but rather indicates that the feature is equally applicable to other claim categories as appropriate. Furthermore, the order of features in the claims does not imply any specific order in which the features must be worked, and in particular the order of individual steps in a method claim does not imply that the steps must be performed in this order; rather, the steps may be performed in any suitable order. In addition, singular references do not exclude a plurality. Thus, references to "a", "an", "first", "second", etc. do not preclude a plurality. Reference signs in the claims are provided merely as clarifying examples and shall not be construed as limiting the scope of the claims in any way.

Claims (15)

1. A multispectral camera comprising:
a light blocking element (201) having at least one hole (203) allowing light to pass;
a dispersing element (205) for dispersing light from the at least one hole (203) in different directions depending on wavelength;
a lens (207) for focusing the light from the dispersing element (205) on an image plane (209);
a microlens array (211) receiving light from the lens (207);
an image sensor (213) receiving light from the microlens array (211) and generating a pixel value signal comprising incident light values for the pixels of the image sensor (213); and
a processor for generating a multispectral image from the pixel value signal.
2. The multispectral camera of claim 1, wherein the multispectral image comprises a spectral distribution representation for each pixel of the multispectral image.
3. The multispectral camera of claim 1, wherein the processor is arranged to:
synthesize a first image of a rainbow plane (215) from the pixel value signal;
generate a second image by applying a spatial mask to the first image, the spatial mask corresponding to a spectral characteristic; and
generate a spatial image for the multispectral image from the second image, the spatial image corresponding to the spectral characteristic.
4. The multispectral camera of claim 3, wherein the processor is arranged to:
determine a plurality of spatial images corresponding to different spectral characteristics by applying corresponding different spatial masks to the first image; and
generate the multispectral image from the plurality of spatial images.
5. The multispectral camera of claim 3, wherein the spectral characteristic corresponds to a bandpass filter.
6. The multispectral camera of claim 1, wherein the microlens array (211) and the image sensor (213) are arranged such that light passing through the at least one hole (203) at an identical angle is distributed over a plurality of pixels of the image sensor (213), the distribution being a wavelength-dependent distribution.
7. The multispectral camera of claim 1, wherein the processor is arranged to compensate individual pixel values of the image sensor (213) for receiving light that passes through the at least one hole (203) at different angles and corresponds to light of different wavelengths.
8. The multispectral camera of claim 1, wherein the microlens array (211) is positioned substantially at the image plane (209).
9. The multispectral camera of claim 1, wherein the microlens array (211) is positioned between the image plane (209) and the image sensor (213).
10. The multispectral camera of claim 1, further comprising a user input and a controller for adjusting a position of at least one of the microlens array (211) and the image sensor (213) in response to the user input.
11. The multispectral camera of claim 1, wherein the light blocking element (201) provides a light blocking plane, and the at least one hole (203) is a slit in the light blocking plane.
12. The multispectral camera of claim 1, wherein the image sensor (213) is a two-dimensional image sensor.
13. The multispectral camera of claim 1, wherein the at least one hole (203) comprises a plurality of holes forming a coded aperture.
14. The multispectral camera of claim 1, wherein the at least one hole (203) comprises a pinhole.
15. A method of generating a multispectral image, the method comprising:
providing a light blocking element (201) having at least one hole (203) allowing light to pass;
providing a dispersing element (205) for dispersing light from the at least one hole (203) in different directions depending on wavelength;
providing a lens (207) for focusing the light from the dispersing element (205) on an image plane (209);
providing a microlens array (211) receiving light from the lens (207);
providing an image sensor (213) for receiving light from the microlens array (211) and generating a pixel value signal comprising incident light values for the pixels of the image sensor (213); and
generating the multispectral image from the pixel value signal.
CN2010800355653A 2009-08-11 2010-08-10 Multi-spectral imaging Pending CN102472664A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP09167624.7 2009-08-11
EP09167624 2009-08-11
PCT/IB2010/053602 WO2011018749A1 (en) 2009-08-11 2010-08-10 Multi-spectral imaging

Publications (1)

Publication Number Publication Date
CN102472664A true CN102472664A (en) 2012-05-23

Family

ID=43028831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010800355653A Pending CN102472664A (en) 2009-08-11 2010-08-10 Multi-spectral imaging

Country Status (7)

Country Link
US (1) US9420241B2 (en)
EP (1) EP2464952B1 (en)
JP (1) JP5723881B2 (en)
KR (1) KR101721455B1 (en)
CN (1) CN102472664A (en)
RU (1) RU2535640C2 (en)
WO (1) WO2011018749A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103091258A (en) * 2013-01-29 2013-05-08 中国科学院光电研究院 Multispectral imager based on liquid zooming technology
CN103686102A (en) * 2012-09-14 2014-03-26 Ricoh Co., Ltd. Image capturing device and image capturing system
CN105675136A (en) * 2016-03-22 2016-06-15 深圳先进技术研究院 Coded aperture spectral imaging system
CN105737985A (en) * 2016-05-10 2016-07-06 中国工程物理研究院流体物理研究所 Digital zooming spectrum imager based on microlens array sensor
CN105874391A (en) * 2013-12-02 2016-08-17 Imec (non-profit association) Apparatus and method for performing in-line lens-free digital holography of an object
CN105931190A (en) * 2016-06-14 2016-09-07 西北工业大学 High-angular-resolution light filed obtaining device and image generation method
CN107192454A (en) * 2017-01-19 2017-09-22 中国科学院上海技术物理研究所 A kind of THz optical spectrum imagers based on three-dimensional phase grating and aperture segmentation technology
CN107402071A (en) * 2017-08-14 2017-11-28 中国科学院地理科学与资源研究所 It is a kind of to realize scene imaging and the device of multispectral survey
CN107436194A (en) * 2017-06-22 2017-12-05 北京理工大学 A kind of high light flux real time spectrum imaging device
CN108459417A (en) * 2018-02-05 2018-08-28 华侨大学 A kind of monocular narrow-band multispectral stereo visual system and its application method
CN108603789A (en) * 2016-02-02 2018-09-28 科磊股份有限公司 System and method for Hyper spectral Imaging metering
CN109073460A (en) * 2016-04-21 2018-12-21 苹果公司 Optical system for reference switching
CN109683429A (en) * 2019-02-27 2019-04-26 中国科学院上海技术物理研究所 A kind of method of the small big visual field camera job stability of F number under promotion complex environment
CN109708756A (en) * 2018-12-11 2019-05-03 南京邮电大学 Imaging spectrometer and high spatial resolution spectrum imaging method based on diffraction effect
US10718931B2 (en) 2014-12-23 2020-07-21 Apple Inc. Confocal inspection system having averaged illumination and averaged collection paths
US11035793B2 (en) 2014-12-23 2021-06-15 Apple Inc. Optical inspection system and method including accounting for variations of optical path length within a sample
CN113588085A (en) * 2021-09-03 2021-11-02 杭州纳境科技有限公司 Miniature snapshot type spectrometer
CN114839795A (en) * 2022-04-24 2022-08-02 上海交通大学 Glasses optical filter design method with blood oxygen information enhancement function and glasses
CN115656131A (en) * 2022-11-08 2023-01-31 南京溯远基因科技有限公司 Microarray shaping lens and fluorescence system for genetic detection
US11579080B2 (en) 2017-09-29 2023-02-14 Apple Inc. Resolve path optical sampling architectures
US11585749B2 (en) 2015-09-01 2023-02-21 Apple Inc. Reference switch architectures for noncontact sensing of substances
US11852318B2 (en) 2020-09-09 2023-12-26 Apple Inc. Optical system for noise mitigation
US11960131B2 (en) 2018-02-13 2024-04-16 Apple Inc. Integrated photonics device having integrated edge outcouplers

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8531581B2 (en) 2011-05-23 2013-09-10 Ricoh Co., Ltd. Focusing and focus metrics for a plenoptic imaging system
CN102494771A (en) * 2011-12-14 2012-06-13 中国科学院光电研究院 Diffractive optical imaging system and imaging spectrometer comprising same
EP2798831B1 (en) 2011-12-28 2016-08-31 Dolby Laboratories Licensing Corporation Spectral image processing
US9137441B2 (en) 2012-02-16 2015-09-15 Ricoh Co., Ltd. Spatial reconstruction of plenoptic images
JP2013195264A (en) * 2012-03-21 2013-09-30 Nikon Corp Spectroscope and microspectroscopic system
US9395516B2 (en) 2012-05-28 2016-07-19 Nikon Corporation Imaging device
US8975594B2 (en) 2012-11-09 2015-03-10 Ge Aviation Systems Llc Mixed-material multispectral staring array sensor
JP6340884B2 (en) * 2013-06-19 2018-06-13 株式会社リコー Measuring apparatus, measuring system and measuring method
US10156488B2 (en) * 2013-08-29 2018-12-18 Corning Incorporated Prism-coupling systems and methods for characterizing curved parts
CN104580877B (en) * 2013-10-29 2018-03-06 华为技术有限公司 The device and method that image obtains
DE102015100395B4 (en) * 2014-02-03 2020-06-18 Bürkert Werke GmbH Spectrometer and fluid analysis system
US9919958B2 (en) 2014-07-17 2018-03-20 Corning Incorporated Glass sheet and system and method for making glass sheet
US9955057B2 (en) 2015-12-21 2018-04-24 Qualcomm Incorporated Method and apparatus for computational scheimpflug camera
US10335045B2 (en) 2016-06-24 2019-07-02 Universita Degli Studi Di Trento Self-adaptive matrix completion for heart rate estimation from face videos under realistic conditions
FR3053464B1 (en) * 2016-06-30 2020-08-14 Office National D'etudes Et De Rech Aerospatiales FOURIER TRANSFORMED MULTI-TRACK SPECTROIMAGER
WO2018209703A1 (en) * 2017-05-19 2018-11-22 Shanghaitech University Method and system for snapshot multi-spectral light field imaging
FR3079612B1 (en) * 2018-03-29 2021-06-04 Eldim OPTICAL DEVICE ALLOWING SIMULTANEOUS MEASUREMENT OF THE ANGULAR AND SPECTRAL EMISSION OF AN OBJECT
WO2022018484A1 (en) 2020-07-20 2022-01-27 CSEM Centre Suisse d'Electronique et de Microtechnique SA - Recherche et Développement Multi-spectral light-field device
CN112097905B (en) * 2020-08-17 2022-12-20 杭州电子科技大学 Spectral microscopic imaging device
KR20230092477A (en) 2021-12-17 2023-06-26 삼성전자주식회사 Spectroscopy device, apparatus and method for estimating bio-information

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1166940C (en) * 2002-03-22 2004-09-15 中国科学院上海光学精密机械研究所 Multispectral imaging gene chip scanning instrument
CN1737515A (en) * 2004-08-18 2006-02-22 深圳大学 Method for realizing two dimensions space light spectrum distinguishing simultaneously and apparatus thereof
US20080088840A1 (en) * 2001-12-21 2008-04-17 Andrew Bodkin Hyperspectral imaging systems
CN101241235A (en) * 2007-02-09 2008-08-13 奥林巴斯映像株式会社 Decoding method, decoding apparatus and electronic camera

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6868231B2 (en) * 2002-06-12 2005-03-15 Eastman Kodak Company Imaging using silver halide films with micro-lens capture and optical reconstruction
EP1839020B1 (en) * 2005-01-19 2017-04-26 Optopo Inc. D/B/A Centice Corporation Static two-dimensional aperture coding for multimodal multiplex spectroscopy
US7968888B2 (en) * 2005-06-08 2011-06-28 Panasonic Corporation Solid-state image sensor and manufacturing method thereof
JP5255750B2 (en) * 2005-12-13 2013-08-07 ライカ マイクロシステムス ツェーエムエス ゲーエムベーハー Detector
WO2008012812A2 (en) * 2006-07-24 2008-01-31 Hyspec Imaging Ltd. Snapshot spectral imaging systems and methods
RU2332645C1 (en) * 2006-11-10 2008-08-27 ФГУП "Государственный оптический институт им. С.И. Вавилова" Small-sized hyperspectrometer based on diffraction polychromator
US20090201498A1 (en) * 2008-02-11 2009-08-13 Ramesh Raskar Agile Spectrum Imaging Apparatus and Method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080088840A1 (en) * 2001-12-21 2008-04-17 Andrew Bodkin Hyperspectral imaging systems
CN1166940C (en) * 2002-03-22 2004-09-15 中国科学院上海光学精密机械研究所 Multispectral imaging gene chip scanning instrument
CN1737515A (en) * 2004-08-18 2006-02-22 深圳大学 Method for realizing two dimensions space light spectrum distinguishing simultaneously and apparatus thereof
CN101241235A (en) * 2007-02-09 2008-08-13 奥林巴斯映像株式会社 Decoding method, decoding apparatus and electronic camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANKIT MOHAN ET AL.: "Agile Spectrum Imaging: Programmable Wavelength Modulation for Cameras and Projectors", COMPUTER GRAPHICS FORUM *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103686102B (en) * 2012-09-14 2016-12-28 Ricoh Co., Ltd. Image capturing device and image capturing system
CN103686102A (en) * 2012-09-14 2014-03-26 Ricoh Co., Ltd. Image capturing device and image capturing system
US9307127B2 (en) 2012-09-14 2016-04-05 Ricoh Company, Ltd. Image capturing device and image capturing system
CN103091258A (en) * 2013-01-29 2013-05-08 中国科学院光电研究院 Multispectral imager based on liquid zooming technology
CN105874391B (en) * 2013-12-02 2019-04-02 Imec (non-profit association) Apparatus and method for performing in-line lens-free digital holography of an object
CN105874391A (en) * 2013-12-02 2016-08-17 Imec (non-profit association) Apparatus and method for performing in-line lens-free digital holography of an object
US10718931B2 (en) 2014-12-23 2020-07-21 Apple Inc. Confocal inspection system having averaged illumination and averaged collection paths
US11726036B2 (en) 2014-12-23 2023-08-15 Apple Inc. Optical inspection system and method including accounting for variations of optical path length within a sample
US11035793B2 (en) 2014-12-23 2021-06-15 Apple Inc. Optical inspection system and method including accounting for variations of optical path length within a sample
US11585749B2 (en) 2015-09-01 2023-02-21 Apple Inc. Reference switch architectures for noncontact sensing of substances
CN108603789A (en) * 2016-02-02 2018-09-28 科磊股份有限公司 System and method for Hyper spectral Imaging metering
CN105675136B (en) * 2016-03-22 2018-10-16 深圳先进技术研究院 A kind of code aperture spectrum imaging system
CN105675136A (en) * 2016-03-22 2016-06-15 深圳先进技术研究院 Coded aperture spectral imaging system
US10788366B2 (en) 2016-04-21 2020-09-29 Apple Inc. Optical system for reference switching
CN109073460A (en) * 2016-04-21 2018-12-21 苹果公司 Optical system for reference switching
US11243115B2 (en) 2016-04-21 2022-02-08 Apple Inc. Optical system for reference switching
CN109073460B (en) * 2016-04-21 2022-08-23 苹果公司 Optical system for reference switching
CN105737985A (en) * 2016-05-10 2016-07-06 中国工程物理研究院流体物理研究所 Digital zooming spectrum imager based on microlens array sensor
CN105931190B (en) * 2016-06-14 2019-09-24 西北工业大学 High angular resolution light field acquisition device and image generating method
CN105931190A (en) * 2016-06-14 2016-09-07 西北工业大学 High-angular-resolution light field obtaining device and image generation method
CN107192454A (en) * 2017-01-19 2017-09-22 中国科学院上海技术物理研究所 A kind of THz optical spectrum imagers based on three-dimensional phase grating and aperture segmentation technology
CN107192454B (en) * 2017-01-19 2018-10-23 中国科学院上海技术物理研究所 A kind of THz optical spectrum imagers based on three-dimensional phase grating and aperture segmentation technology
CN107436194A (en) * 2017-06-22 2017-12-05 北京理工大学 A kind of high light flux real time spectrum imaging device
CN107402071A (en) * 2017-08-14 2017-11-28 中国科学院地理科学与资源研究所 It is a kind of to realize scene imaging and the device of multispectral survey
US11579080B2 (en) 2017-09-29 2023-02-14 Apple Inc. Resolve path optical sampling architectures
CN108459417B (en) * 2018-02-05 2020-06-26 华侨大学 Monocular narrow-band multispectral stereoscopic vision system and using method thereof
CN108459417A (en) * 2018-02-05 2018-08-28 华侨大学 A kind of monocular narrow-band multispectral stereo visual system and its application method
US11960131B2 (en) 2018-02-13 2024-04-16 Apple Inc. Integrated photonics device having integrated edge outcouplers
CN109708756A (en) * 2018-12-11 2019-05-03 南京邮电大学 Imaging spectrometer and high spatial resolution spectrum imaging method based on diffraction effect
CN109708756B (en) * 2018-12-11 2022-02-08 南京邮电大学 Imaging spectrometer based on diffraction effect and high spatial resolution spectral imaging method
CN109683429A (en) * 2019-02-27 2019-04-26 中国科学院上海技术物理研究所 A kind of method of the small big visual field camera job stability of F number under promotion complex environment
US11852318B2 (en) 2020-09-09 2023-12-26 Apple Inc. Optical system for noise mitigation
CN113588085A (en) * 2021-09-03 2021-11-02 杭州纳境科技有限公司 Miniature snapshot type spectrometer
CN114839795A (en) * 2022-04-24 2022-08-02 上海交通大学 Glasses optical filter design method with blood oxygen information enhancement function and glasses
CN115656131A (en) * 2022-11-08 2023-01-31 南京溯远基因科技有限公司 Microarray shaping lens and fluorescence system for genetic detection
CN115656131B (en) * 2022-11-08 2023-09-19 南京溯远基因科技有限公司 Microarray plastic lens for genetic detection and fluorescence system

Also Published As

Publication number Publication date
KR20120049331A (en) 2012-05-16
JP2013501930A (en) 2013-01-17
EP2464952B1 (en) 2018-10-10
RU2012108715A (en) 2013-09-20
WO2011018749A1 (en) 2011-02-17
EP2464952A1 (en) 2012-06-20
KR101721455B1 (en) 2017-04-10
US9420241B2 (en) 2016-08-16
RU2535640C2 (en) 2014-12-20
US20120127351A1 (en) 2012-05-24
JP5723881B2 (en) 2015-05-27

Similar Documents

Publication Publication Date Title
CN102472664A (en) Multi-spectral imaging
Cao et al. A prism-mask system for multispectral video acquisition
US7652765B1 (en) Hyper-spectral imaging methods and devices
Horstmeyer et al. Flexible multimodal camera using a light field architecture
US6859275B2 (en) System and method for encoded spatio-spectral information processing
CA2594105C (en) A system for multi- and hyperspectral imaging
US7180588B2 (en) Devices and method for spectral measurements
US8149400B2 (en) Coded aperture snapshot spectral imager and method therefor
US20050270528A1 (en) Hyper-spectral imaging methods and devices
EP2216999A1 (en) Image processing device, image processing method, and imaging device
Shrestha et al. Multispectral imaging using a stereo camera: Concept, design and assessment
WO2005088264A1 (en) Hyper-spectral imaging methods and devices
CN105210361A (en) Plenoptic imaging device
US9426383B1 (en) Digital camera for capturing spectral and spatial information
Monno et al. Optimal spectral sensitivity functions for a single-camera one-shot multispectral imaging system
CN109764964A (en) A kind of push-scanning type polarization light spectrum image-forming micro-system, imaging method and preparation method
CN107436194A (en) A kind of high light flux real time spectrum imaging device
Rueda et al. Single aperture spectral+ ToF compressive camera: toward hyperspectral+ depth imagery
Takatani et al. One-shot hyperspectral imaging using faced reflectors
WO2005086818A2 (en) Devices and method for spectral measurements
Horstmeyer et al. Modified light field architecture for reconfigurable multimode imaging
US20220103797A1 (en) Integrated Spatial Phase Imaging
TWI822342B (en) Hyperspectral camera
CN113739915A (en) Spectrum appearance, module and terminal equipment of making a video recording
Zhang Investigations into applications of photometric stereo and single-pixel imaging

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20120523