WO2015103446A2 - Device attachment with dual band imaging sensor - Google Patents

Device attachment with dual band imaging sensor

Info

Publication number
WO2015103446A2
WO2015103446A2 (PCT application PCT/US2014/073096)
Authority
WO
WIPO (PCT)
Prior art keywords
thermal
device attachment
images
infrared
imaging module
Application number
PCT/US2014/073096
Other languages
French (fr)
Other versions
WO2015103446A3 (en)
Inventor
Per Elmfors
Michael Kent
Original Assignee
Flir Systems, Inc.
Priority claimed from PCT/US2013/078551 external-priority patent/WO2014106276A2/en
Priority claimed from US14/246,006 external-priority patent/US9674458B2/en
Priority claimed from US14/281,883 external-priority patent/US9900478B2/en
Priority claimed from US14/299,987 external-priority patent/US9083897B2/en
Priority claimed from PCT/US2014/059200 external-priority patent/WO2015051344A1/en
Application filed by Flir Systems, Inc. filed Critical Flir Systems, Inc.
Priority to CN201480076762.8A priority Critical patent/CN106068446B/en
Priority to KR1020167021120A priority patent/KR102418369B1/en
Publication of WO2015103446A2 publication Critical patent/WO2015103446A2/en
Publication of WO2015103446A3 publication Critical patent/WO2015103446A3/en
Priority to US15/199,867 priority patent/US11297264B2/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022 Radiation pyrometry for sensing the radiation of moving bodies
    • G01J5/0025 Living bodies
    • G01J5/02 Constructional details
    • G01J5/0205 Mechanical elements; Supports for optical elements
    • G01J5/025 Interfacing a pyrometer to an external device or network; User interface
    • G01J5/0275 Control or determination of height or distance or angle information for sensors or receivers
    • G01J5/04 Casings
    • G01J5/07 Arrangements for adjusting the solid angle of collected radiation, e.g. adjusting or orienting field of view, tracking position or encoding angular position
    • G01J5/08 Optical arrangements
    • G01J5/0803 Arrangements for time-dependent attenuation of radiation signals
    • G01J5/0804 Shutters
    • G01J5/0846 Optical arrangements having multiple detectors for performing different types of detection, e.g. using radiometry and reflectometry channels
    • G01J5/0893 Arrangements to attach devices to a pyrometer, i.e. attaching an optical interface; Spatial relative arrangement of optical elements, e.g. folded beam path
    • G01J5/0896 Optical arrangements using a light source, e.g. for illuminating a surface
    • G01J5/80 Calibration
    • G01J2005/0077 Imaging
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0254 Portable telephone sets comprising one or a plurality of mechanically detachable modules
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly

Definitions

  • One or more embodiments of the invention relate generally to infrared imaging devices and, more particularly, to infrared imaging devices for portable equipment, for example, systems and methods for multi-spectrum imaging using infrared imaging devices.
  • Portable electronic devices such as smart phones, cell phones, tablet devices, portable media players, portable game devices, digital cameras, and laptop computers typically include a visible-light image sensor or camera that allows users to take a still picture or a video clip.
  • One of the reasons for the increasing popularity of such embedded cameras may be the ubiquitous nature of mobile phones and other portable electronic devices. That is, because users may already be carrying mobile phones and other portable electronic devices, such embedded cameras are always at hand when users need one.
  • Another reason for the increasing popularity may be the increasing processing power, storage capacity, and/or display capability that allow sufficiently fast capturing, processing, and storage of large, high quality images using mobile phones and other portable electronic devices.
  • image sensors used in these portable electronic devices are typically CCD-based or CMOS-based sensors limited to capturing visible light images. As such, these sensors may at best detect only a very limited range of visible light or wavelengths close to visible light (e.g., near infrared light when objects are actively illuminated with light in the near infrared spectrum). As a result, there is a need for techniques to provide infrared imaging capability in a portable electronic device form factor.
  • a device attachment configured to releasably attach to and provide infrared imaging functionality to mobile phones or other portable electronic devices.
  • a device attachment may include a housing with a partial enclosure (e.g., a tub or cutout) on a rear surface thereof shaped to at least partially receive a user device, a multi-wavelength image sensor assembly disposed within the housing and configured to capture infrared image data and visible light image data, and a processing module communicatively coupled to the multi- wavelength sensor assembly and configured to transmit the infrared image data and/or the visible light image data to the user device.
  • the device attachment may be configured to cooperate with one or more components of an attached device such as a smartphone to capture and/or process image data.
  • an additional visible light camera on a smart phone attached to the device attachment may be used to capture additional visible light images that can be used, together with visible light images captured using a visible light image sensor in the device attachment, to measure distances to objects in a scene using the parallax of the objects between the two visible light image sensors. The measured distances can be used to align or otherwise combine infrared images from the infrared image sensor with the visible light images from the visible light imaging module.
  • a light source in a smart phone attached to the device attachment may be operated to illuminate some or all of a scene to be imaged by imaging modules in the device attachment for use in combining infrared and visible light images.
  • a timer may be used to determine when a thermal imaging module in the device attachment can be used for determining calibrated temperatures of imaged objects.
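The timer-based gating of calibrated temperature readings described above can be sketched as follows. The 180-second validity window and the class name are illustrative assumptions, not values taken from the disclosure:

```python
import time

# Hypothetical interval after which a new calibration event (e.g., a
# shutter-based correction) is needed before reporting calibrated temperatures.
CALIBRATION_VALID_SECONDS = 180.0

class ThermalCalibrationTimer:
    """Tracks elapsed time since the last calibration of a thermal imaging module."""

    def __init__(self):
        self._last_calibration = None

    def record_calibration(self, timestamp=None):
        """Note the time of a calibration event (defaults to now)."""
        self._last_calibration = time.time() if timestamp is None else timestamp

    def temperatures_are_calibrated(self, now=None):
        """Return True only if a calibration happened recently enough for
        image-based temperatures of imaged objects to be trusted."""
        if self._last_calibration is None:
            return False
        now = time.time() if now is None else now
        return (now - self._last_calibration) <= CALIBRATION_VALID_SECONDS
```

An app would check `temperatures_are_calibrated()` before displaying a radiometric reading, and fall back to uncalibrated imagery otherwise.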
  • Fig. 1 illustrates a front perspective view of a device attachment in accordance with an embodiment of the disclosure.
  • Fig. 2 illustrates a slider module of a device attachment in accordance with an embodiment of the disclosure.
  • Fig. 3 illustrates a rear perspective view of a device attachment in accordance with an embodiment of the disclosure.
  • Fig. 4 illustrates a diagram of a device attachment and an attached device showing how non-thermal image data from the device attachment and the attached device can be used in merging non-thermal and thermal image data from the device attachment in accordance with an embodiment of the disclosure.
  • Fig. 5 illustrates a flow diagram of various operations for using non-thermal image data from a device attachment and an attached device in merging non-thermal and thermal image data from the device attachment in accordance with an embodiment of the disclosure.
  • Fig. 6 illustrates a flow diagram of various operations for calibrating non-thermal image data from a device attachment and an attached device for later use in merging non-thermal and thermal image data from the device attachment in accordance with an embodiment of the disclosure.
  • Fig. 7 illustrates a flow diagram of various operations for using a time since a calibration for determining whether calibrated image-based temperatures can be determined in accordance with an embodiment of the disclosure.
  • Fig. 8 illustrates a flow diagram of various operations to enhance imaging of a scene in accordance with an embodiment of the disclosure.
  • Fig. 9 illustrates a flow diagram of various operations to enhance imaging of a scene based on user input in accordance with an embodiment of the disclosure.
  • Infrared image sensors such as infrared imaging module 7000 can capture images of thermal energy radiation emitted from all objects having a temperature above absolute zero, and thus can be used to produce infrared images (e.g., thermograms) that can be beneficially used in a variety of situations, including viewing in a low or no light condition, detecting body temperature anomalies in people (e.g., for detecting illness), detecting invisible gases, inspecting structures for water leaks and damaged insulation, detecting electrical and mechanical equipment for unseen damages, and other situations where true infrared images may provide useful information.
  • Device attachment 1250 may be configured to receive a portable electronic device such as user device 1200.
  • a rear perspective view of a device attachment having a shape for receiving a device 1200 from Apple, Inc.® (e.g., iPhone™ devices, iPad™ devices, or iPod Touch™ devices) is shown.
  • device attachment 1250 may have a shape suitable for receiving devices from Samsung Electronics, Ltd.® (e.g., Galaxy Tab™ devices or other Galaxy™ devices), or a smart phone, tablet, or portable electronic device from any other manufacturer.
  • device attachment 1250 may include a camera window 1240 through which a device camera 101 (e.g., a non-thermal camera module such as a visible light camera module) can capture images, a device light source 103 (e.g., a camera flash or flashlight) can illuminate some or all of a scene, and/or one or more other sensors 105 of device 1200 can receive or emit light.
  • Device attachment 1250 may include a plurality of imaging components such as infrared imaging module 7000 and non-thermal camera module 7002 and one or more internal electronic components such as battery 1208 or other internal components such as a processor, memory, or communications components (as examples).
  • device attachment 1250 may also include a mechanical shutter such as a user operable shutter. The user operable shutter may be moved by a user of device attachment 1250 by sliding a button 7004 (e.g., an on/off switch) to selectively block or unblock imaging components 7000 and 7002 with an internal shutter member that is attached to button 7004.
  • Fig. 2 is a perspective view of a slider assembly 248 having button 7004 and a shutter member 250 with openings 252 and 254.
  • Button 7004 may be used to push shutter member 250 along directions indicated by arrows 256 to selectively move openings 252 and 254 in front of imaging modules 7000 and 7002 of Fig. 1.
  • imaging modules 7000 and 7002 may receive light from a scene through openings 252 and 254 for image capture operations.
  • button 7004 is moved so that a portion of shutter member 250 blocks imaging modules 7000 and/or 7002, light from the scene may be prevented from reaching imaging modules 7000 and/or 7002.
  • button 7004 may be configured to power device attachment 1250 on or off while moving shutter member 250 to block or unblock imaging components 7000 and 7002.
  • shutter member 250 may be used, for example, to protect imaging components 7000 and 7002 when not in use.
  • Shutter 250 may also be used as a temperature reference as part of a calibration process (e.g., a non-uniformity correction (NUC) process as described in U.S. Patent Application No. 14/099,818 filed December 6, 2013, which is incorporated by reference herein in its entirety, a radiometric calibration process, and/or other calibration processes) for infrared imaging module 7000 as would be understood by one skilled in the art.
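As a rough illustration of how a closed shutter can serve as a uniform-temperature reference, one might average a few shutter frames and treat each pixel's deviation from the spatial mean as a fixed-pattern offset. This simplified, offset-only sketch is an assumption for illustration and not the NUC process of the referenced application:

```python
import numpy as np

def compute_nuc_offsets(shutter_frames):
    """Estimate per-pixel offsets from frames of the closed (uniform) shutter.

    Averaging several frames reduces temporal noise; each pixel's deviation
    from the spatial mean is treated as fixed-pattern offset error.
    """
    flat = np.mean(shutter_frames, axis=0)  # per-pixel shutter response
    return flat - flat.mean()               # deviation from uniform scene

def apply_nuc(raw_frame, offsets):
    """Subtract the fixed-pattern offsets from a raw thermal frame."""
    return np.asarray(raw_frame, dtype=float) - offsets
```

Applying `apply_nuc` to subsequent scene frames removes the stationary pattern measured while the shutter blocked the imaging module.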
  • Device attachment 1250 may include a front portion 7007 and a rear portion 7009.
  • Front portion 7007 may be formed from a housing that encloses functional components of the device attachment such as a battery, connectors, imaging components, processors, memory, communications components, and/or other components of a device attachment as described herein.
  • Rear portion 7009 may be a structural housing portion having a shape that forms a recess into which user device 1200 is configured to be releasably attached.
  • Fig. 3 is a front perspective view of the device attachment of Fig. 1 showing how a user device 1200 from Apple, Inc.® having a display 201 may be releasably attached to device attachment 1250 by inserting the device into a recess in a housing for the device attachment formed from a rear wall and at least one sidewall that at least partially surround the device.
  • Device attachment 1250 may include a device connector that carries various signals and electrical power to and from user device 1200 when attached.
  • the device connector may be disposed at a location that is suitably aligned with a corresponding device connector receptacle or socket of user device 1200, so that the device connector can engage the corresponding device connector receptacle or socket of user device 1200 when device attachment 1250 is attached to user device 1200.
  • the device connector may be positioned at an appropriate location on a bottom side wall of device attachment 1250.
  • the device connector may also include a mechanical fixture (e.g., a locking/latched connector plug) used to support and/or align user device 1200.
  • the device connector may be implemented according to the connector specification associated with the type of user device 1200.
  • the device connector may implement a proprietary connector (e.g., an Apple® dock connector for iPod™ and iPhone™ such as a "Lightning" connector, a 30-pin connector, or others) or a standardized connector (e.g., various versions of Universal Serial Bus (USB) connectors or a Portable Digital Media Interface (PDMI)).
  • the device connector may be interchangeably provided, so that device attachment 1250 may accommodate different types of user devices that accept different device connectors.
  • various types of device connector plugs may be provided and configured to be attached to a base connector of device attachment 1250, so that a connector plug that is compatible with user device 1200 can be attached to the base connector before attaching device attachment 1250 to user device 1200.
  • the device connector may be fixedly provided.
  • Device attachment 1250 may also communicate with user device 1200 via a wireless connection.
  • device attachment 1250 may include a wireless communication module configured to facilitate wireless communication between user device 1200 and device attachment 1250.
  • a wireless communication module may support the IEEE 802.11 WiFi standards, the Bluetooth™ standard, the ZigBee™ standard, or other appropriate short range wireless communication standards.
  • device attachment 1250 may be used with user device 1200 without relying on the device connector, if a connection through the device connector is not available or not desired.
  • Infrared imaging module 7000 may be implemented, for one or more embodiments, with a small form factor and in accordance with wafer level packaging techniques or other packaging techniques.
  • Infrared imaging module 7000 may include a lens barrel, a housing, an infrared sensor assembly, a circuit board, a base, and a processing module.
  • An infrared sensor assembly may include a plurality of infrared sensors (e.g., infrared detectors) implemented in an array or other fashion on a substrate and covered by a cap.
  • an infrared sensor assembly may be implemented as a focal plane array (FPA).
  • Such a focal plane array may be implemented, for example, as a vacuum package assembly.
  • an infrared sensor assembly may be implemented as a wafer level package (e.g., singulated from a set of vacuum package assemblies provided on a wafer). In one embodiment, an infrared sensor assembly may be implemented to operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or similar voltages.
  • Infrared sensors in infrared imaging module 7000 may be configured to detect infrared radiation (e.g., infrared energy) from a target scene including, for example, mid wave infrared wave bands (MWIR), long wave infrared wave bands (LWIR), and/or other thermal imaging bands as may be desired in particular implementations.
  • Infrared sensors may be implemented, for example, as microbolometers or other types of thermal imaging infrared sensors arranged in any desired array pattern to provide a plurality of pixels.
  • User device 1200 may be any type of portable electronic device that may be configured to communicate with device attachment 1250 to receive infrared images captured by infrared sensor assembly 7000 and/or non-thermal images such as visible light images from non-thermal imaging module 7002.
  • Infrared image data captured by infrared imaging module 7000 and/or non-thermal image data such as visible light image data captured by non-thermal imaging module 7002 may be provided to a processing module of device attachment 1250 and/or device 1200 for further processing.
  • the processing module may be configured to perform appropriate processing of captured infrared image data, and transmit raw and/or processed infrared image data to user device 1200. For example, when device attachment 1250 is attached to user device 1200, a processing module may transmit raw and/or processed infrared image data to user device 1200 via a wired device connector or wirelessly via appropriate wireless components further described herein.
  • user device 1200 may be appropriately configured to receive the infrared image data (e.g., thermal image data) and/or non-thermal image data from device attachment 1250 to display user-viewable infrared images (e.g., thermograms) to users on display 201 and permit users to store infrared image data, non-thermal image data, multi-wavelength image data, and/or user-viewable infrared images. That is, user device 1200 may be configured to run appropriate software instructions (e.g., a smart phone "app") to function as an infrared camera that permits users to frame and take infrared, non-infrared, and/or combined still images, videos, or both.
  • Device attachment 1250 and user device 1200 may be configured to perform other infrared imaging functionalities, such as storing and/or analyzing thermographic data (e.g., temperature information) contained within infrared image data.
  • Device attachment 1250 may also include a battery 1208 (see, e.g., Fig. 1). Battery 1208 may be configured to be used as a power source for internal components of device attachment 1250, so that device attachment 1250 does not drain the battery of user device 1200 when attached. Further, battery 1208 of device attachment 1250 may be configured to provide electrical power to user device 1200, for example, through a device connector. Thus, the battery 1208 may beneficially provide a backup power for user device 1200 to run and charge from. Conversely, various components of device attachment 1250 may be configured to use electrical power from a battery of user device 1200 (e.g., through a device connector), if a user desires to use functionalities of device attachment 1250 even when the battery of device attachment 1250 is drained.
  • a non-thermal camera module 101 of device 1200 may be used together with non-thermal camera module 7002 of device attachment 1250.
  • the two images may be mapped to each other pixel by pixel. Differences between the two cameras (e.g., distortion, parallax, pointing angle, etc.) can be compensated.
  • Imaging modules 7000 and 7002 may be mounted close to each other to reduce parallax differences between images captured with the imaging modules.
  • non-thermal camera 101 in the device 1200 can be used in conjunction with non-thermal camera module 7002 to determine the distance to the objects in a scene. The determined distance can then be used to adjust the alignment of infrared (e.g., thermal) and non-thermal (e.g., visible) video images even at variable scene distances.
  • non-thermal camera module 7002 and non-thermal camera module 101 can each provide a non-thermal (e.g., visible) image of a scene to processing circuitry such as distance measure engine 301.
  • Distance measure engine 301 can determine the distance to scene objects using the known distance D between non-thermal camera module 7002 and non-thermal camera module 101 and a measured shift in position of the scene objects in the images provided by non-thermal camera module 7002 and non-thermal camera module 101.
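The triangulation performed by distance measure engine 301 follows the standard stereo relation Z = f * B / d, where f is the focal length in pixels, B is the known baseline D between the two non-thermal camera modules, and d is the measured parallax shift. The numeric values below are illustrative only:

```python
def distance_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Pinhole-stereo triangulation: Z = f * B / d.

    focal_length_px: focal length in pixels (assumed known from calibration)
    baseline_m:      known distance D between the two non-thermal cameras
    disparity_px:    measured shift of a scene object between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px
```

For example, with a 1000-pixel focal length, a 5 cm baseline, and a 25-pixel shift, the object is 2 m away.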
  • the measured distance, the non-thermal image captured by non-thermal imaging module 7002, and a thermal image (e.g., an infrared image) from thermal imaging module 7000 can be provided to processing circuitry such as merge engine 303.
  • Merge engine 303 can use the measured distance to correct any remaining parallax differences between the thermal image and the non-thermal image so that the thermal image and the non-thermal image can be combined and provided to display 201 for display to a user.
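Merge engine 303's use of the measured distance might look like the following sketch. It assumes a purely horizontal baseline, an integer-pixel shift, and simple alpha blending; a real implementation would also handle distortion and sub-pixel registration:

```python
import numpy as np

def merge_with_parallax_correction(thermal, visible, focal_px, baseline_m,
                                   distance_m, alpha=0.5):
    """Shift the thermal image by the parallax predicted for the measured
    scene distance, then alpha-blend it with the visible image."""
    # Same triangulation geometry, inverted: shift in pixels = f * B / Z.
    shift_px = int(round(focal_px * baseline_m / distance_m))
    aligned = np.roll(thermal, -shift_px, axis=1)  # horizontal baseline assumed
    return alpha * aligned + (1.0 - alpha) * visible
```

The blended result is what would be provided to display 201 for viewing.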
  • Distance measure engine 301 and merge engine 303 may represent algorithms performed by a logic device (e.g., a programmable logic device or microprocessor).
  • Fig. 5 is a flow chart of illustrative operations for using a non-thermal imaging module in a device attachment and a non-thermal imaging module in an attached device to provide a parallax correction for images captured by the non-thermal imaging module in a device attachment and a thermal imaging module in the device attachment.
  • a first non-thermal image may be captured using a non-thermal image sensor in the device attachment and, optionally, a thermal image may be captured using a thermal image sensor in the device attachment.
  • a second non-thermal image may be captured using a non-thermal image sensor in a device camera.
  • a distance may be determined to a scene object using the first and second non-thermal images (e.g., by determining a parallax-induced shift of the object in the first and second non-thermal images and triangulating the distance to the object using the determined shift and the known relative locations of the non-thermal image sensor in the device attachment and the non-thermal image sensor in the device).
  • the known relative locations may be determined based on the known positions of the non-thermal image sensors in each respective device and the known position of the device within the device attachment and/or based on a calibration operation performed by capturing an image of an object at a known distance using both of the non-thermal image sensors and determining the relative locations of the non-thermal image sensors using the images of the object and the known distance.
  • the capturing of the non-thermal images may be controlled to improve accuracy of determining a parallax-induced shift between the first and second non-thermal images, which in turn would improve the accuracy of the determined distance and the parallax correction.
  • the accuracy of determining a parallax-induced shift may be affected due to a shift and/or blurring of the objects in the images caused by the motion.
  • Such a motion-induced shift may occur, for example, if the timing of the capturing by the first and second non-thermal image sensors is not adequately synchronized.
  • operations of Fig. 5 may involve detecting a movement of the device and/or device attachment (e.g., by an accelerometer or other types of motion detector provided in the device and/or the device attachment) and/or detecting a movement of a target object in the scene (e.g., by processing captured images as would be understood by one skilled in the art).
  • the non-thermal images may be captured when the detected movement is below a desired threshold and/or the captured images synchronized to obtain non-thermal images less affected by motion.
  • the operations of Fig. 5 may involve capturing multiple frames of non-thermal images by the first and second non-thermal image sensors, respectively, while operating a light source (e.g., light source 103 of user device 1200) to flash (e.g., illuminate for a short period of time) all or some of the scene.
  • the frames captured by the first non-thermal image sensor may be processed to detect and select a frame containing an image of the flashed scene (e.g., a frame at the start, middle, or end of the flash event), and the frames captured by the second non-thermal image sensor may be processed similarly.
  • the selected frames may be substantially synchronized to the moment (or some moment in time) when the scene was illuminated, thereby reducing the effects, if any, of a motion-induced shift (e.g., achieve sufficient synchronization of the captured images).
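Selecting the flash-illuminated frame from each camera's burst can be approximated by picking the brightest frame. Using mean intensity as the brightness proxy is an assumption made here for illustration, not a method mandated by the disclosure:

```python
import numpy as np

def select_flashed_frame(frames):
    """Return the index of the frame most likely captured while the light
    source was firing, using mean pixel intensity as a brightness proxy."""
    means = [float(np.mean(f)) for f in frames]
    return int(np.argmax(means))
```

Applying this to the bursts from both non-thermal image sensors yields a pair of frames captured near the same flash event, giving the synchronization described above.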
  • the thermal image and the first non-thermal image may be combined using the determined distance to the object (e.g., by performing a parallax correction between the thermal image and the first non-thermal image using the determined distance).
  • any distortion and alignment error between the non-thermal camera module in the device attachment and the non-thermal camera module in the device can be calibrated.
  • an image may be captured of an object such as a hand in front of a thermally and visually uniform background using the non-thermal camera module in the device attachment, the thermal imaging module in the device attachment, and the non-thermal camera module in the device.
  • Processing circuitry (e.g., a smartphone app running on the device processor) may then process the captured images to determine the distortion and alignment errors between the camera modules.
  • Fig. 6 is a flowchart of operations for calibrating a distortion and/or alignment between a non-thermal camera module in the device attachment and a non-thermal camera module in the device.
  • images may be captured of an object (e.g., a hand) at a common time using each of a thermal image sensor in a device attachment, a non-thermal image sensor in the device attachment, and a non-thermal image sensor in an attached device.
  • edges of the object in each captured image may be detected.
  • alignment and distortion corrections between the non-thermal image sensor in the device attachment and the non-thermal image sensor in the attached device may be determined based on the locations in the images of the detected edges.
  • the alignment and distortion corrections may be stored (e.g., in the device attachment or the device) for use in distance measurements for parallax corrections between images captured using the thermal image sensor and the non-thermal image sensor in the device attachment.
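One way the stored alignment correction of Fig. 6 might be represented is as a least-squares affine transform fitted to matched edge locations from the two sensors' images. This is an illustrative sketch under that assumption, not the patented method:

```python
import numpy as np

def estimate_affine(points_src, points_dst):
    """Least-squares affine transform (2x3 matrix) mapping matched edge
    points detected in one sensor's image to the corresponding points
    in the other sensor's image."""
    src = np.asarray(points_src, dtype=float)
    dst = np.asarray(points_dst, dtype=float)
    ones = np.ones((len(src), 1))
    A = np.hstack([src, ones])  # N x 3 homogeneous coordinates
    # Solve A @ M.T ~= dst for the 2x3 affine matrix M.
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T
```

The resulting matrix could be stored once per attachment/device pairing and re-applied to every subsequent frame before disparity measurement.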
  • although various embodiments are described herein with reference to a non-thermal image sensor in a user device (e.g., a phone camera) and a thermal image sensor and a non-thermal image sensor in a device attachment, it is also contemplated that the principles and spirit of the present disclosure may be applied to any other appropriate combination of image sensors in the user device and/or the device attachment.
  • a non-thermal image sensor of the user device and a non-thermal image sensor of a device attachment for the user device may be utilized to provide a parallax correction between the thermal image sensor of the user device and the non-thermal image sensor of either the user device or the device attachment.
  • the non-thermal image sensors of the user device may be utilized to provide parallax correction for image sensors of the device (and/or the device attachment if present).
  • thermal imaging module 7000 may be used to determine an image-based calibrated temperature of an object (e.g., by capturing one or more calibrated thermal images and determining, from the intensity and/or spectrum of the object in the thermal images, the temperature of the object as would be understood by one skilled in the art). The accuracy of this type of image-based temperature measurement can be improved by ensuring that the thermal imaging module has been recently calibrated when an image-based temperature measurement is to be made.
  • Fig. 7 is a flow chart of illustrative operations for ensuring that the thermal imaging module has been recently calibrated when an image-based temperature measurement is to be made.
  • a system such as a system that includes a device attachment having a thermal image sensor and an attached device may perform a calibration of a thermal image sensor such as a thermal image sensor in a device attachment using a closed shutter (e.g., by closing the shutter and capturing one or more images of the shutter using the thermal image sensor).
  • the system may monitor the time since the last calibration of the thermal image sensor (e.g., by a processor in the device attachment or a processor in an attached device).
  • the system may receive a request for an image-based temperature determination from a user.
  • the system may determine whether the time since calibration is less than a maximum allowable time using the monitored time.
  • the maximum allowable time may be, as examples, 10 seconds, 20 seconds, 30 seconds, or one minute since the last calibration.
  • the system may proceed to block 608.
  • one or more thermal images and/or an infrared spectrum of an object may be captured.
  • the system may determine the temperature of the object from thermal images and/or the infrared spectrum.
  • the system may proceed to block 612.
  • the system may instruct the user to perform a new calibration of the thermal imaging module using the closed shutter to ensure that the subsequent temperature measurement is accurate.
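The time-gating logic of Fig. 7 can be sketched as a small timer class; the class and method names are hypothetical, and an injectable clock stands in for the processor's monotonic timer:

```python
import time

class ThermalCalibrationTimer:
    """Tracks time since the last shutter-based calibration and gates
    image-based temperature measurements on its freshness."""

    def __init__(self, max_age_s=30.0, clock=time.monotonic):
        self._clock = clock
        self._max_age = max_age_s
        self._last_cal = None  # no calibration performed yet

    def record_calibration(self):
        """Call when the shutter-based calibration completes."""
        self._last_cal = self._clock()

    def is_fresh(self):
        """True if a temperature measurement may proceed; False means
        the user should be asked to recalibrate first."""
        if self._last_cal is None:
            return False
        return (self._clock() - self._last_cal) < self._max_age
```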
  • a light source in a portable electronic device that is attached to a device attachment having a thermal imaging module may be used in cooperation with the thermal imaging module and a non-thermal imaging module to enhance imaging of a scene.
  • light source 103 of device 1200 may be used to illuminate at least a portion of a scene in a spectrum sensed by one or more of imaging modules 7000, 7002, and/or 101.
  • Light source 103 can be flashed or operated in a flashlight mode to illuminate some or all of the scene during image capture operations using imaging modules 7000 and 7002.
  • Light source 103 may be turned on and/or flashed in response to user input or may be automatically turned on and/or flashed based on, for example, a light level determined using imaging module 7002, device camera 101, or other light sensor.
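The automatic flash decision mentioned above might be expressed as a simple threshold test; the function name, the threshold value, and the arbitrary-unit light level are all assumptions for illustration:

```python
def should_flash(scene_light_level, threshold=40.0, user_requested=False):
    """Decide whether to operate the light source: flash when the user
    asks, or automatically when the measured scene light level (in
    arbitrary sensor counts) falls below a hypothetical threshold."""
    return user_requested or scene_light_level < threshold
```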
  • Fig. 8 illustrates a flow diagram of various operations to enhance imaging of a scene using thermal images and active illumination of the scene according to an embodiment.
  • thermal image data may be captured using a thermal image sensor in a device attachment and non-thermal image data may be captured using a non-thermal image sensor in the device attachment. If desired, additional non-thermal image data may be captured using a camera in an attached device.
  • a light source of an attached device may be operated.
  • the light source may be operated (e.g., flashed or held on) during image capture operations based on, for example, user input and/or automatically determined light levels. Illuminating the scene using the light source may enhance the non-thermal images captured by the non-thermal image sensor in the device attachment.
  • the captured thermal image data and the captured non-thermal image data from the device attachment may be combined to form an enhanced output image that includes some or all of the thermal image data and actively illuminated non-thermal image data.
  • thermal and non-thermal images may be processed to generate combined images using high contrast processing.
  • high spatial frequency content may be obtained from one or more of the thermal and non-thermal images (e.g., by performing high pass filtering, difference imaging, and/or other techniques).
  • a combined image may include a radiometric component of a thermal image and a blended component including infrared (e.g., thermal) characteristics of a scene blended with the high spatial frequency content, according to a blending parameter, which may be adjustable by a user and/or machine in some embodiments.
  • high spatial frequency content from non-thermal images may be blended with thermal images by superimposing the high spatial frequency content onto the thermal images, where the high spatial frequency content replaces or overwrites those portions of the thermal images corresponding to where the high spatial frequency content exists.
  • the high spatial frequency content may include edges of objects depicted in images of a scene, but may not exist within the interior of such objects.
  • in such cases, blended image data may simply include the high spatial frequency content, which may subsequently be encoded into one or more components of combined images.
  • a radiometric component of a thermal image may be a chrominance component of the thermal image
  • the high spatial frequency content may be derived from the luminance and/or chrominance components of a non-thermal image.
  • a combined image may include the radiometric component (e.g., the chrominance component of the thermal image) encoded into a chrominance component of the combined image and the high spatial frequency content directly encoded (e.g., as blended image data but with no thermal image contribution) into a luminance component of the combined image.
  • blended image data may include the high spatial frequency content added to a luminance component of the thermal images, and the resulting blended data encoded into a luminance component of resulting combined images.
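A minimal sketch of the luminance blending described above, assuming a boxcar high-pass filter and a scalar blending parameter (both illustrative choices; the disclosure contemplates other filters and encodings):

```python
import numpy as np

def high_pass(img, kernel=5):
    """High spatial frequency content via image minus boxcar low-pass
    (a simple stand-in for the high-pass filtering described above)."""
    pad = kernel // 2
    padded = np.pad(img, pad, mode='edge')
    low = np.zeros_like(img, dtype=float)
    for dy in range(kernel):
        for dx in range(kernel):
            low += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    low /= kernel * kernel
    return img - low

def blend_luminance(thermal_lum, nonthermal_lum, zeta=0.5):
    """Add high spatial frequency content from the non-thermal image to
    the thermal luminance, weighted by a blending parameter zeta in
    [0, 1]; the result would be encoded into the combined image's
    luminance, leaving the radiometric chrominance untouched."""
    return thermal_lum + zeta * high_pass(nonthermal_lum)
```

With a featureless non-thermal image the high-pass term vanishes and the thermal luminance passes through unchanged, which matches the intent that only edge detail is contributed.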
  • the non-thermal image may be from any type of non-thermal imager, including for example a visible light imager, a low light visible light imager, a CCD imaging device, an EMCCD imaging device, a CMOS imaging device, a sCMOS imaging device, a NIR imaging device, a SWIR imaging device, or other types of non-thermal imagers (e.g., including passive or active illumination as would be understood by one skilled in the art).
  • any one of device attachment 1250 or device 1200 may be configured to receive user input indicating a portion of interest to be imaged by a first imaging module (e.g., infrared imaging module 7000), control the light source 103 to illuminate at least the portion-of-interest in a spectrum sensed by a second imaging module (e.g., visible spectrum imaging module 7002 and/or 101), and receive illuminated captured images of the first imaging module (e.g., infrared imaging module 7000).
  • a thermal image may be used to detect a "hot" spot in an image, such as an image of a circuit breaker box.
  • Light source 103 may be used to illuminate a label of a circuit breaker to provide a better image and potentially pinpoint the cause of the hot spot.
  • Fig. 9 illustrates a flow diagram of various operations to enhance imaging of a scene based on user input in accordance with an embodiment of the disclosure.
  • one or more portions of process 5800 may be performed by device attachment 1250, device 1200, and/or each of imaging modules 7000 and/or 7002, utilizing any of the components described and/or enumerated therein.
  • any step, sub-step, sub-process, or block of process 5800 may be performed in an order or arrangement different from the embodiment illustrated by Fig. 9.
  • any portion of process 5800 may be implemented in a loop so as to continuously operate on a series of infrared and/or visible spectrum images, such as a video of a scene.
  • process 5800 may be implemented in a partial feedback loop including display of intermediary processing (e.g., after or while receiving infrared and/or visible spectrum images, registering images to each other, generating illuminated and/or combined images, or performing other processing of process 5800) to a user, for example, and/or including receiving user input, such as user input directed to any intermediary processing step.
  • process 5800 may include one or more steps, sub-steps, sub-processes, or blocks of any of the other processes described herein.
  • device attachment 1250 generates visible spectrum images of a scene.
  • imaging module 7002 may be configured to generate one or more visible spectrum images of a scene.
  • block 5810 may include one or more operations discussed with reference to the processes of Figs. 5-8.
  • a device camera such as camera 101 may also capture visible spectrum images.
  • device attachment 1250 generates infrared images of the scene.
  • imaging modules 7000 may be configured to generate one or more infrared images of the scene.
  • block 5812 may include one or more operations discussed with reference to the processes of Figs. 5-8.
  • device attachment 1250 produces an output signal of data corresponding to the generated images.
  • any one of imaging modules 7000 or 7002 and/or a processor may be adapted to produce an output signal of data corresponding to the images generated in blocks 5810 and 5812.
  • the output signal may adhere to a particular interface standard, for example, such as MIPI®.
  • device attachment 1250 and/or device 1200 stores the data according to a common data format.
  • the data may be stored in a desired data file according to a common data format.
  • device attachment 1250 and/or device 1200 registers the images to each other.
  • device attachment 1250 and/or device 1200 may be adapted to register any one of the generated images to another one of the generated images by performing one or more of interpolation, scaling, cropping, rotational transformation, morphing, and/or filtering operations on one or more of the images to substantially match spatial content within the images.
  • device attachment 1250 and/or device 1200 may be adapted to register images to each other using one or more of the processes described in connection with Figs. 4-6.
  • device attachment 1250 and/or device 1200 receives user input indicating a portion-of-interest of the scene.
  • device attachment 1250 and/or device 1200 may be adapted to receive user input provided by one or more other components, a touchscreen display, and/or other devices indicating a portion-of-interest of the already imaged scene.
  • the user input may be used to designate a pixel or group of pixels corresponding to the portion-of-interest.
  • the user input may be combined with the selection of registration operations performed in block 5840 to determine corresponding pixels in a variety of captured images.
  • a light source such as light source 103 of device 1200 illuminates the portion of interest.
  • any one of device attachment 1250 and/or device 1200 may be adapted to control light source 103 to illuminate all or a designated portion of interest in a particular scene.
  • a particular spectrum and/or portion of a scene may be selected by controlling a MEMS lens and/or other system coupled or otherwise associated with an illumination module.
  • device attachment 1250 and/or device 1200 generates illuminated images of the portion-of-interest.
  • any one of imaging modules 7000, 7002, or 101 sensitive to the spectrum illuminated in block 5852 may be adapted to generate an illuminated image that is captured while light source 103 is illuminating at least the portion-of-interest designated in block 5850.
  • a combined image may include a visible spectrum image with embedded data corresponding to infrared image data for each pixel of visible spectrum data.
  • a user may select a pixel or group of pixels with a user interface and text corresponding to the infrared image data may be displayed alongside the visible spectrum image, such as in a text box or legend, for example.
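The per-pixel readout described above might look like the following, assuming a hypothetical data layout with one embedded thermal value per visible spectrum pixel; the function name and text format are illustrative:

```python
import numpy as np

def thermal_readout(embedded_thermal, pixel):
    """Return a text annotation (e.g., for a legend or text box) for the
    thermal value embedded at a user-selected (y, x) pixel of a
    combined image."""
    y, x = pixel
    value = float(embedded_thermal[y, x])
    return f"Pixel ({x}, {y}): {value:.1f} (thermal units)"
```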
  • any one of imaging modules 7000, 7002, and/or 101 may be adapted to generate combined images using one or more of the processes described herein, including processes described in connection with Figs. 5-8.
  • device 1200 displays one or more of the generated images.
  • device 1200 may be adapted to use a display (e.g., display 201 in Fig. 2) to display one or more of the images generated in process 5800.
  • block 5870 may include one or more operations discussed with reference to processes of Figs. 4-6.

Abstract

Various techniques are disclosed for providing a device attachment configured to releasably attach to and provide infrared imaging functionality to mobile phones or other portable electronic devices. The device attachment may include an infrared imaging module and a non-thermal imaging module that cooperate with one or more of a non-thermal imaging module in an attached device and a light source in the attached device for capturing and processing images.

Description

DEVICE ATTACHMENT WITH DUAL BAND IMAGING SENSOR
Per Elmfors and Michael Kent CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application No.
61/923,732 filed January 5, 2014 and entitled "DEVICE ATTACHMENT WITH DUAL BAND IMAGING SENSOR" which is hereby incorporated by reference in its entirety.
This application is a continuation-in-part of U.S. Patent Application No. 14/281,883 filed May 19, 2014 and entitled "DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/281,883 is a continuation-in-part of International Patent Application No. PCT/US2013/062433 filed September 27, 2013 and entitled "DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2013/062433 claims the benefit of U.S. Provisional Patent Application No. 61/880,827 filed September 20, 2013 and entitled
"DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2013/062433 is a continuation-in-part of
U.S. Patent Application No. 13/901,428 filed May 23, 2013 and entitled "DEVICE
ATTACHMENT WITH INFRARED IMAGING SENSOR" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/901,428 claims the benefit of U.S. Provisional Patent Application No. 61/652,075 filed May 25, 2012 and entitled "DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/901,428 is a continuation-in-part of U.S. Design Patent Application No. 29/423,027 filed May 25, 2012 and entitled "DEVICE ATTACHMENT WITH CAMERA" which is hereby incorporated by reference in its entirety.
This application is a continuation-in-part of International Patent Application No. PCT/US2013/78551 filed December 31, 2013 and entitled "INFRARED IMAGING DEVICE HAVING A SHUTTER" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2013/78551 claims the benefit of U.S. Provisional Patent Application No. 61/747,789 filed December 31, 2012 and entitled "INFRARED IMAGING DEVICE HAVING A SHUTTER" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2013/78551 is a continuation-in-part of U.S. Patent Application No. 13/966,052 filed August 13, 2013 and entitled "INFRARED CAMERA SYSTEM HOUSING WITH METALIZED SURFACE" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/966,052 claims the benefit of U.S. Provisional Patent Application No. 61/683,124 filed August 14, 2012 and entitled "INFRARED CAMERA SYSTEM HOUSING WITH METALIZED SURFACE" which is hereby incorporated by reference in its entirety.
This application is a continuation-in-part of International Patent Application No.
PCT/US2014/59200 filed October 3, 2014 and entitled "DURABLE COMPACT
MULTISENSOR OBSERVATION DEVICES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2014/59200 is a continuation-in-part of U.S. Patent Application No. 14/101,245 filed December 9, 2013 and entitled "LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/101,245 is a continuation of International Patent Application No. PCT/US2012/041744 filed June 8, 2012 and entitled "LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041744 claims the benefit of U.S. Provisional Patent Application No. 61/656,889 filed June 7, 2012 and entitled "LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041744 claims the benefit of U.S. Provisional Patent Application No. 61/545,056 filed October 7, 2011 and entitled
"NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041744 claims the benefit of U.S. Provisional Patent Application No. 61/495,873 filed June 10, 2011 and entitled "INFRARED CAMERA PACKAGING SYSTEMS AND METHODS" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041744 claims the benefit of U.S. Provisional Patent Application No. 61/495,879 filed June 10, 2011 and entitled "INFRARED CAMERA SYSTEM ARCHITECTURES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041744 claims the benefit of U.S. Provisional Patent Application No. 61/495,888 filed June 10, 2011 and entitled "INFRARED CAMERA CALIBRATION TECHNIQUES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2014/59200 is a continuation-in-part of U.S. Patent Application No. 14/099,818 filed December 6, 2013 and entitled
"NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/099,818 is a continuation of International Patent
Application No. PCT/US2012/041749 filed June 8, 2012 and entitled "NON- UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041749 claims the benefit of U.S. Provisional Patent Application No. 61/545,056 filed October 7, 2011 and entitled
"NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041749 claims the benefit of U.S. Provisional Patent Application No. 61/495,873 filed June 10, 2011 and entitled "INFRARED CAMERA PACKAGING SYSTEMS AND METHODS" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041749 claims the benefit of U.S. Provisional Patent Application No. 61/495,879 filed June 10, 2011 and entitled "INFRARED CAMERA SYSTEM ARCHITECTURES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041749 claims the benefit of U.S.
Provisional Patent Application No. 61/495,888 filed June 10, 2011 and entitled "INFRARED CAMERA CALIBRATION TECHNIQUES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2014/59200 is a continuation-in-part of U.S. Patent Application No. 14/101,258 filed December 9, 2013 and entitled "INFRARED CAMERA SYSTEM ARCHITECTURES" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/101,258 is a continuation of International Patent Application No. PCT/US2012/041739 filed June 8, 2012 and entitled "INFRARED CAMERA SYSTEM ARCHITECTURES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041739 claims the benefit of U.S. Provisional Patent Application No. 61/495,873 filed June 10, 2011 and entitled "INFRARED CAMERA PACKAGING SYSTEMS AND METHODS" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041739 claims the benefit of U.S.
Provisional Patent Application No. 61/495,879 filed June 10, 2011 and entitled "INFRARED CAMERA SYSTEM ARCHITECTURES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2012/041739 claims the benefit of U.S. Provisional Patent Application No. 61/495,888 filed June 10, 2011 and entitled "INFRARED CAMERA CALIBRATION TECHNIQUES" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2014/59200 is a continuation-in-part of U.S. Patent Application No. 14/138,058 filed December 21, 2013 and entitled "COMPACT MULTI-SPECTRUM IMAGING WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/138,058 claims the benefit of U.S. Provisional Patent Application No. 61/748,018 filed December 31, 2012 and entitled "COMPACT
MULTI-SPECTRUM IMAGING WITH FUSION" which is hereby incorporated by reference in its entirety.
This patent application is a continuation-in-part of U.S. Patent Application No.
14/299,987 filed June 9, 2014 and entitled "INFRARED CAMERA SYSTEMS AND
METHODS FOR DUAL SENSOR APPLICATIONS" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/299,987 is a continuation of U.S. Patent Application No. 12/477,828 filed June 3, 2009 and entitled "INFRARED CAMERA SYSTEMS AND METHODS FOR DUAL SENSOR APPLICATIONS" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2014/59200 is a continuation-in-part of U.S. Patent Application No. 14/138,040 filed December 21, 2013 and entitled "TIME
SPACED INFRARED IMAGE ENHANCEMENT" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/138,040 claims the benefit of U.S. Provisional Patent Application No. 61/792,582 filed March 15, 2013 and entitled "TIME SPACED INFRARED IMAGE ENHANCEMENT" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/138,040 also claims the benefit of U.S. Provisional
Patent Application No. 61/746,069 filed December 26, 2012 and entitled "TIME SPACED INFRARED IMAGE ENHANCEMENT" which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/US2014/59200 is a continuation-in-part of U.S. Patent Application No. 14/138,052 filed December 21, 2013 and entitled "INFRARED IMAGING ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/138,052 claims the benefit of U.S. Provisional Patent Application No. 61/793,952 filed March 15, 2013 and entitled "INFRARED IMAGING ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/138,052 also claims the benefit of U.S. Provisional Patent Application No. 61/746,074 filed December 26, 2012 and entitled "INFRARED IMAGING ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
This patent application is a continuation-in-part of U.S. Patent Application No.
14/246,006 filed April 4, 2014 entitled "SMART SURVEILLANCE CAMERA SYSTEMS AND METHODS" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/246,006 is a continuation-in-part of U.S. Patent Application No. 13/437,645 filed April 2, 2012 and entitled "INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/437,645 is a continuation-in-part of U.S. Patent Application No. 13/105,765 filed May 11, 2011 and entitled "INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/437,645 also claims the benefit of U.S. Provisional
Patent Application No. 61/473,207 filed April 8, 2011 and entitled "INFRARED
RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/437,645 is also a continuation-in-part of U.S. Patent Application No. 12/766,739 filed April 23, 2010 and entitled "INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/105,765 is a continuation of International Patent Application No. PCT/EP2011/056432 filed April 21, 2011 and entitled "INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/105,765 is also a continuation-in-part of U.S. Patent Application No. 12/766,739 which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/EP2011/056432 is a continuation-in-part of U.S. Patent Application No. 12/766,739 which is hereby incorporated by reference in its entirety.
International Patent Application No. PCT/EP2011/056432 also claims the benefit of U.S. Provisional Patent Application No. 61/473,207 which is hereby incorporated by reference in its entirety.
This patent application is a continuation-in-part of U.S. Patent Application No.
14/029,716 filed September 17, 2013 and entitled "ROW AND COLUMN NOISE
REDUCTION IN THERMAL IMAGES" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/029,716 claims the benefit of U.S. Provisional Patent Application No. 61/745,489 filed December 21, 2012 and entitled "ROW AND COLUMN NOISE REDUCTION IN THERMAL IMAGES" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/029,716 claims the benefit of U.S. Provisional Patent Application No. 61/745,504 filed December 21, 2012 and entitled "PIXEL- WISE NOISE REDUCTION IN THERMAL IMAGES" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 14/029,716 is a continuation-in-part of U.S. Patent Application No. 13/622,178 filed September 18, 2012 and entitled "SYSTEMS AND
METHODS FOR PROCESSING INFRARED IMAGES" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/622,178 is a continuation-in-part of U.S. Patent
Application No. 13/529,772 filed June 21, 2012 and entitled "SYSTEMS AND METHODS FOR PROCESSING INFRARED IMAGES" which is hereby incorporated by reference in its entirety.
U.S. Patent Application No. 13/529,772 is a continuation of U.S. Patent Application No. 12/396,340 filed March 2, 2009 and entitled "SYSTEMS AND METHODS FOR
PROCESSING INFRARED IMAGES" which is hereby incorporated by reference in its entirety.
TECHNICAL FIELD
One or more embodiments of the invention relate generally to infrared imaging devices and more particularly, for example, to infrared imaging devices for portable equipment and to systems and methods for multi-spectrum imaging using infrared imaging devices.
BACKGROUND
Various types of portable electronic devices, such as smart phones, cell phones, tablet devices, portable media players, portable game devices, digital cameras, and laptop computers, are in widespread use. These devices typically include a visible-light image sensor or camera that allows users to take a still picture or a video clip. One of the reasons for the increasing popularity of such embedded cameras may be the ubiquitous nature of mobile phones and other portable electronic devices. That is, because users may already be carrying mobile phones and other portable electronic devices, such embedded cameras are always at hand when users need one. Another reason for the increasing popularity may be the increasing processing power, storage capacity, and/or display capability that allow sufficiently fast capturing, processing, and storage of large, high quality images using mobile phones and other portable electronic devices.
However, image sensors used in these portable electronic devices are typically CCD-based or CMOS-based sensors limited to capturing visible light images. As such, these sensors may at best detect only a very limited range of visible light or wavelengths close to visible light (e.g., near infrared light when objects are actively illuminated with light in the near infrared spectrum). As a result, there is a need for techniques to provide infrared imaging capability in a portable electronic device form factor.
SUMMARY
Various techniques are disclosed for providing a device attachment configured to releasably attach to and provide infrared imaging functionality to mobile phones or other portable electronic devices. For example, a device attachment may include a housing with a partial enclosure (e.g., a tub or cutout) on a rear surface thereof shaped to at least partially receive a user device, a multi-wavelength image sensor assembly disposed within the housing and configured to capture infrared image data and visible light image data, and a processing module communicatively coupled to the multi-wavelength sensor assembly and configured to transmit the infrared image data and/or the visible light image data to the user device.
The device attachment may be configured to cooperate with one or more components of an attached device such as a smartphone to capture and/or process image data. For example, an additional visible light camera on a smart phone attached to the device attachment may be used to capture additional visible light images that can be used, together with visible light images captured using a visible light image sensor in the device attachment, to measure distances to objects in a scene using the parallax of the objects between the two visible light image sensors. The measured distances can be used to align or otherwise combine infrared images from the infrared image sensor with the visible light images from the visible light imaging module. As another example, a light source in a smart phone attached to the device attachment may be operated to illuminate some or all of a scene to be imaged by imaging modules in the device attachment for use in combining infrared and visible light images.
A timer may be used to determine when a thermal imaging module in the device attachment can be used for determining calibrated temperatures of imaged objects.
The scope of the invention is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the invention will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more embodiments. Reference will be made to the appended sheets of drawings that will first be described briefly.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates a front perspective view of a device attachment in accordance with an embodiment of the disclosure.
Fig. 2 illustrates a slider module of a device attachment in accordance with an embodiment of the disclosure.
Fig. 3 illustrates a rear perspective view of a device attachment in accordance with an embodiment of the disclosure.
Fig. 4 illustrates a diagram of a device attachment and an attached device showing how non-thermal image data from the device attachment and the attached device can be used in merging non-thermal and thermal image data from the device attachment in accordance with an embodiment of the disclosure.
Fig. 5 illustrates a flow diagram of various operations for using non-thermal image data from a device attachment and an attached device in merging non-thermal and thermal image data from the device attachment in accordance with an embodiment of the disclosure.
Fig. 6 illustrates a flow diagram of various operations for calibrating non-thermal image data from a device attachment and an attached device for later use in merging non-thermal and thermal image data from the device attachment in accordance with an embodiment of the disclosure.
Fig. 7 illustrates a flow diagram of various operations for using a time since a calibration for determining whether calibrated image-based temperatures can be determined in accordance with an embodiment of the disclosure.
Fig. 8 illustrates a flow diagram of various operations to enhance imaging of a scene in accordance with an embodiment of the disclosure.
Fig. 9 illustrates a flow diagram of various operations to enhance imaging of a scene based on user input in accordance with an embodiment of the disclosure.
Embodiments of the invention and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
DETAILED DESCRIPTION
Referring now to Figs. 1 and 2, various views are shown of a device attachment 1250 having an infrared imaging module 7000 and a non-thermal camera module 7002. Infrared image sensors such as infrared imaging module 7000 can capture images of thermal energy radiation emitted from all objects having a temperature above absolute zero, and thus can be used to produce infrared images (e.g., thermograms) that can be beneficially used in a variety of situations, including viewing in a low or no light condition, detecting body temperature anomalies in people (e.g., for detecting illness), detecting invisible gases, inspecting structures for water leaks and damaged insulation, inspecting electrical and mechanical equipment for unseen damage, and other situations where true infrared images may provide useful information.
Device attachment 1250 may be configured to receive a portable electronic device such as user device 1200. In the embodiment of Fig. 1, a rear perspective view of a device attachment having a shape for receiving a device 1200 from Apple, Inc.® (e.g., iPhone™ devices, iPad™ devices, or iPod Touch™ devices) is shown. However, this is merely illustrative. If desired, device attachment 1250 may have a shape suitable for receiving devices from Samsung Electronics, Ltd.® (e.g., Galaxy Tab™ devices, other Galaxy™ devices, or other devices from Samsung) or a smart phone, tablet or portable electronic device from any other manufacturer. As shown in Fig. 1, device attachment 1250 may include a camera window 1240 through which a device camera 101 (e.g., a non-thermal camera module such as a visible light camera module) can capture images, a device light source 103 (e.g., a camera flash or flashlight) can illuminate some or all of a scene, and/or one or more other sensors 105 of device 1200 can receive or emit light. Device attachment 1250 may include a plurality of imaging components such as infrared imaging module 7000 and non-thermal camera module 7002 and one or more internal electronic components such as battery 1208 or other internal components such as a processor, memory, or communications components (as examples). If desired, device attachment 1250 may also include a mechanical shutter such as a user operable shutter. The user operable shutter may be moved by a user of device attachment 1250 by sliding a button 7004 (e.g., an on/off switch) to selectively block or unblock imaging components 7000 and 7002 with an internal shutter member that is attached to button 7004.
Fig. 2 is a perspective view of a slider assembly 248 having button 7004 and a shutter member 250 with openings 252 and 254. Button 7004 may be used to push shutter member 250 along directions indicated by arrows 256 to selectively move openings 252 and 254 in front of imaging modules 7000 and 7002 of Fig. 1. When openings 252 and 254 are in front of imaging modules 7000 and 7002, imaging modules 7000 and 7002 may receive light from a scene through openings 252 and 254 for image capture operations. When button 7004 is moved so that a portion of shutter member 250 blocks imaging modules 7000 and/or 7002, light from the scene may be prevented from reaching imaging modules 7000 and/or 7002. In some embodiments, button 7004 may be configured to power device attachment 1250 on or off while moving shutter member 250 to block or unblock imaging components 7000 and 7002.
In some embodiments, shutter member 250 may be used, for example, to protect imaging components 7000 and 7002 when not in use. Shutter 250 may also be used as a temperature reference as part of a calibration process (e.g., a non-uniformity correction (NUC) process as described in U.S. Patent Application No. 14/099,818 filed December 6, 2013 which is incorporated by reference herein in its entirety, a radiometric calibration process, and/or other calibration processes) for infrared imaging module 7000 as would be understood by one skilled in the art. Device attachment 1250 may include a front portion 7007 and a rear portion 7009. Front portion 7007 may be formed from a housing that encloses functional components of the device attachment such as a battery, connectors, imaging components, processors, memory, communications components, and/or other components of a device attachment as described herein. Rear portion 7009 may be a structural housing portion having a shape that forms a recess into which user device 1200 is configured to be releasably attached.
Fig. 3 is a front perspective view of the device attachment of Fig. 1 showing how a user device 1200 from Apple, Inc.® having a display 201 may be releasably attached to device attachment 1250 by inserting the device into a recess in a housing for the device attachment formed from a rear wall and at least one sidewall that at least partially surround the device. Device attachment 1250 may include a device connector that carries various signals and electrical power to and from user device 1200 when attached. The device connector may be disposed at a location that is suitably aligned with a corresponding device connector receptacle or socket of user device 1200, so that the device connector can engage the corresponding device connector receptacle or socket of user device 1200 when device attachment 1250 is attached to user device 1200. For example, if user device 1200 is equipped with a connector receptacle on its bottom side surface, the device connector may be positioned at an appropriate location on a bottom side wall of device attachment 1250. The device connector may also include a mechanical fixture (e.g., a locking/latched connector plug) used to support and/or align user device 1200.
The device connector may be implemented according to the connector specification associated with the type of user device 1200. For example, the device connector may implement a proprietary connector (e.g., an Apple® dock connector for iPod™ and iPhone™ such as a "Lightning" connector, a 30-pin connector, or others) or a standardized connector (e.g., various versions of Universal Serial Bus (USB) connectors, Portable Digital Media Interface (PDMI), or other standard connectors as provided in user devices).
In one embodiment, the device connector may be interchangeably provided, so that device attachment 1250 may accommodate different types of user devices that accept different device connectors. For example, various types of device connector plugs may be provided and configured to be attached to a base connector device attachment 1250, so that a connector plug that is compatible with user device 1200 can be attached to the base connector before attaching device attachment 1250 to user device 1200. In another embodiment, the device connector may be fixedly provided.
Device attachment 1250 may also communicate with user device 1200 via a wireless connection. In this regard, device attachment 1250 may include a wireless communication module configured to facilitate wireless communication between user device 1200 and device attachment 1250. In various embodiments, a wireless communication module may support the IEEE 802.11 WiFi standards, the Bluetooth™ standard, the ZigBee™ standard, or other appropriate short range wireless communication standards. Thus, device attachment 1250 may be used with user device 1200 without relying on the device connector, if a connection through the device connector is not available or not desired.
Infrared imaging module 7000 may be implemented, for one or more embodiments, with a small form factor and in accordance with wafer level packaging techniques or other packaging techniques. Infrared imaging module 7000 may include a lens barrel, a housing, an infrared sensor assembly, a circuit board, a base, and a processing module. An infrared sensor assembly may include a plurality of infrared sensors (e.g., infrared detectors) implemented in an array or other fashion on a substrate and covered by a cap. For example, in one embodiment, an infrared sensor assembly may be implemented as a focal plane array (FPA). Such a focal plane array may be implemented, for example, as a vacuum package assembly. In one embodiment, an infrared sensor assembly may be implemented as a wafer level package (e.g., singulated from a set of vacuum package assemblies provided on a wafer). In one embodiment, an infrared sensor assembly may be implemented to operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or similar voltages.
Infrared sensors in infrared imaging module 7000 may be configured to detect infrared radiation (e.g., infrared energy) from a target scene including, for example, mid wave infrared wave bands (MWIR), long wave infrared wave bands (LWIR), and/or other thermal imaging bands as may be desired in particular implementations. Infrared sensors may be implemented, for example, as microbolometers or other types of thermal imaging infrared sensors arranged in any desired array pattern to provide a plurality of pixels.
User device 1200 may be any type of portable electronic device that may be configured to communicate with device attachment 1250 to receive infrared images captured by infrared sensor assembly 7000 and/or non-thermal images such as visible light images from non-thermal imaging module 7002.
Infrared image data captured by infrared imaging module 7000 and/or non-thermal image data such as visible light image data captured by non-thermal imaging module 7002 may be provided to a processing module of device attachment 1250 and/or device 1200 for further processing.
The processing module may be configured to perform appropriate processing of captured infrared image data, and transmit raw and/or processed infrared image data to user device 1200. For example, when device attachment 1250 is attached to user device 1200, a processing module may transmit raw and/or processed infrared image data to user device 1200 via a wired device connector or wirelessly via appropriate wireless components further described herein. Thus, for example, user device 1200 may be appropriately configured to receive the infrared image data (e.g., thermal image data) and/or non-thermal image data from device attachment 1250 to display user-viewable infrared images (e.g., thermograms) to users on display 201 and permit users to store infrared image data, non-thermal image data, multi-wavelength image data, and/or user-viewable infrared images. That is, user device 1200 may be configured to run appropriate software instructions (e.g., a smart phone "app") to function as an infrared camera that permits users to frame and take infrared, non-infrared, and/or combined still images, videos, or both. Device attachment 1250 and user device 1200 may be configured to perform other infrared imaging functionalities, such as storing and/or analyzing thermographic data (e.g., temperature information) contained within infrared image data.
Device attachment 1250 may also include a battery 1208 (see, e.g., Fig. 1). Battery 1208 may be configured to be used as a power source for internal components of device attachment 1250, so that device attachment 1250 does not drain the battery of user device 1200 when attached. Further, battery 1208 of device attachment 1250 may be configured to provide electrical power to user device 1200, for example, through a device connector. Thus, battery 1208 may beneficially provide backup power for user device 1200 to run on and charge from. Conversely, various components of device attachment 1250 may be configured to use electrical power from a battery of user device 1200 (e.g., through a device connector), if a user desires to use functionalities of device attachment 1250 even when the battery of device attachment 1250 is drained. In some embodiments, a non-thermal camera module 101 of device 1200 may be used together with non-thermal camera module 7002 of device attachment 1250. When blending infrared (e.g., thermal) and non-thermal (e.g., visible) video images, the two images may be mapped to each other pixel by pixel. Differences between the two cameras (e.g., distortion, parallax, pointing angle, etc.) can be compensated for. Imaging modules 7000 and 7002 may be mounted close to each other to reduce parallax differences between images captured with the imaging modules. In order to provide corrections for any remaining parallax differences, particularly for very nearby objects in an image, non-thermal camera 101 in the device 1200 can be used in conjunction with non-thermal camera module 7002 to determine the distance to the objects in a scene. The determined distance can then be used to adjust the alignment of infrared (e.g., thermal) and non-thermal (e.g., visible) video images even at variable scene distances.
As shown in Fig. 4, non-thermal camera module 7002 and non-thermal camera module 101 can each provide a non-thermal (e.g., visible) image of a scene to processing circuitry such as distance measure engine 301. Distance measure engine 301 can determine the distance to scene objects using the known distance D between non-thermal camera module 7002 and non-thermal camera module 101 and a measured shift in position of the scene objects in the images provided by non-thermal camera module 7002 and non-thermal camera module 101.
The measured distance, the non-thermal image captured by non-thermal imaging module 7002, and a thermal image (e.g., an infrared image) from thermal imaging module 7000 can be provided to processing circuitry such as merge engine 303. Merge engine 303 can use the measured distance to correct any remaining parallax differences between the thermal image and the non-thermal image so that the thermal image and the non-thermal image can be combined and provided to display 201 for display to a user. Distance measure engine 301 and merge engine 303 may represent algorithms performed by a logic device (e.g., a programmable logic device or microprocessor).
Fig. 5 is a flow chart of illustrative operations for using a non-thermal imaging module in a device attachment and a non-thermal imaging module in an attached device to provide a parallax correction for images captured by the non-thermal imaging module in a device attachment and a thermal imaging module in the device attachment. At block 400, a first non-thermal image may be captured using a non-thermal image sensor in the device attachment and, optionally, a thermal image may be captured using a thermal image sensor in the device attachment.
At block 402, a second non-thermal image may be captured using a non-thermal image sensor in a device camera.
At block 404, a distance may be determined to a scene object using the first and second non-thermal images (e.g., by determining a parallax-induced shift of the object in the first and second non-thermal images and triangulating the distance to the object using the determined shift and the known relative locations of the non-thermal image sensor in the device attachment and the non-thermal image sensor in the device). The known relative locations may be determined based on the known positions of the non-thermal image sensors in each respective device and the known position of the device within the device attachment and/or based on a calibration operation performed by capturing an image of an object at a known distance using both of the non-thermal image sensors and determining the relative locations of the non-thermal image sensors using the images of the object and the known distance.
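The triangulation described in block 404 can be sketched as follows. This is an illustrative example only, not the application's implementation; it assumes an idealized pinhole model for both visible cameras, and all parameter names and values are hypothetical.

```python
# Illustrative stereo distance estimate for block 404: the distance to an
# object is triangulated from the parallax-induced pixel shift between the
# two non-thermal images, given the known camera separation D (baseline)
# and a focal length expressed in pixels. Idealized pinhole-camera sketch.

def distance_to_object(disparity_px, baseline_m, focal_px):
    """Triangulate object distance (meters) from the measured pixel shift.

    disparity_px: shift of the object between the two non-thermal images
    baseline_m:   known separation D between the two camera modules
    focal_px:     focal length expressed in pixels
    """
    if disparity_px <= 0:
        raise ValueError("object must show a positive parallax shift")
    return baseline_m * focal_px / disparity_px

# Example: a 4 px shift with a 6 cm baseline and 1500 px focal length
d = distance_to_object(4.0, 0.06, 1500.0)  # 22.5 m
```

Note the inverse relationship: the smaller the measured shift, the farther the object, which is why parallax correction matters most for very nearby objects.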
In some embodiments, the capturing of the non-thermal images may be controlled to improve accuracy of determining a parallax-induced shift between the first and second non-thermal images, which in turn would improve the accuracy of the determined distance and the parallax correction. For example, if the first and second non-thermal images are captured while the user and/or the object are in motion, the accuracy of determining a parallax-induced shift may be affected due to a shift and/or blurring of the objects in the images caused by the motion. Such a motion-induced shift may occur, for example, if the timing of the capturing by the first and second non-thermal image sensors is not adequately synchronized.
Thus, in one embodiment, operations of Fig. 5 may involve detecting a movement of the device and/or device attachment (e.g., by an accelerometer or other types of motion detector provided in the device and/or the device attachment) and/or detecting a movement of a target object in the scene (e.g., by processing captured images as would be understood by one skilled in the art). In this embodiment, the non-thermal images may be captured when the detected movement is below a desired threshold, and/or the captured images may be synchronized to obtain non-thermal images less affected by motion. In another embodiment, operations of Fig. 5 may involve capturing multiple frames of non-thermal images by the first and second non-thermal image sensors, respectively, while operating a light source (e.g., light source 103 of user device 1200) to flash (e.g., illuminate for a short period of time) all or some of the scene. The frames captured by the first non-thermal image sensor may be processed to detect and select a frame containing an image of the flashed scene. Similarly, a frame containing an image of the flashed scene (e.g., a frame at the start, middle, or end of the flash event) may be detected and selected. In this way, for example, the selected frames may be substantially synchronized to the moment (or some moment in time) when the scene was illuminated, thereby reducing the effects, if any, of a motion-induced shift (e.g., to achieve sufficient synchronization of the captured images).
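One simple way to detect the flashed frame in a burst, sketched below, is to pick the frame whose mean brightness peaks; the flash event is assumed to dominate the scene illumination. This is a hypothetical illustration (frames here are plain 2-D brightness arrays), not code from the application.

```python
# Hypothetical flash-frame selection: among a burst of frames captured
# while the light source flashed, the frame with the highest mean
# intensity is assumed to coincide with the flash event. Selecting the
# corresponding frame from each sensor's burst approximately
# synchronizes the two captures to the moment of illumination.

def select_flash_frame(frames):
    """Return (index, frame) of the brightest frame in a burst.

    frames: list of 2-D brightness arrays (lists of rows of numbers)
    """
    def mean_brightness(frame):
        total = sum(sum(row) for row in frame)
        count = sum(len(row) for row in frame)
        return total / count

    best = max(range(len(frames)), key=lambda i: mean_brightness(frames[i]))
    return best, frames[best]
```

Applying the same selection to the bursts from both non-thermal sensors yields a frame pair captured at (approximately) the same instant, reducing motion-induced shift between them.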
At block 406, the thermal image and the first non-thermal image may be combined using the determined distance to the object (e.g., by performing a parallax correction between the thermal image and the first non-thermal image using the determined distance).
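A minimal sketch of the parallax correction in block 406 follows, under the assumption (not stated in the application) that the residual misalignment reduces to a horizontal translation that scales inversely with scene distance; all names are illustrative.

```python
# Illustrative parallax correction for block 406: the residual shift
# between the thermal and non-thermal images scales inversely with
# scene distance, so the thermal image is translated by roughly
# baseline * focal / distance pixels before the images are combined.

def parallax_shift_px(baseline_m, focal_px, distance_m):
    """Expected pixel shift between the two modules at a given distance."""
    return baseline_m * focal_px / distance_m

def shift_rows(image, shift):
    """Translate a 2-D image horizontally by `shift` whole pixels,
    padding newly exposed pixels with zeros."""
    shift = int(round(shift))
    out = []
    for row in image:
        if shift >= 0:
            out.append([0] * shift + row[: len(row) - shift])
        else:
            out.append(row[-shift:] + [0] * (-shift))
    return out
```

In practice sub-pixel interpolation and distortion correction would also be applied, but the inverse-distance scaling is the reason the measured distance from blocks 400-404 is needed here.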
In order to improve the parallax corrections determined using the non-thermal camera module in the device attachment and the non-thermal camera module in the device, any distortion and alignment error between the non-thermal camera module in the device attachment and the non-thermal camera module in the device can be calibrated. For example, an image may be captured of an object such as a hand in front of a thermally and visually uniform background using the non-thermal camera module in the device attachment, the thermal imaging module in the device attachment, and the non-thermal camera module in the device. Processing circuitry (e.g., a smartphone app running on the device processor) can be used to match the edges of the hand in all three images and correlate the alignment between the two non-thermal camera modules to a factory calibrated alignment between the non-thermal camera module in the device attachment and the thermal imaging module in the device attachment.
Fig. 6 is a flowchart of operations for calibrating a distortion and/or alignment between a non-thermal camera module in the device attachment and a non-thermal camera module in the device.
At block 500, images may be captured of an object (e.g., a hand) at a common time using each of a thermal image sensor in a device attachment, a non-thermal image sensor in the device attachment, and a non-thermal image sensor in an attached device. At block 502, edges of the object in each captured image may be detected.
At block 504, alignment and distortion corrections between the non-thermal image sensor in the device attachment and the non-thermal image sensor in the attached device may be determined based on the locations in the images of the detected edges. At block 506, the alignment and distortion corrections may be stored (e.g., in the device attachment or the device) for use in distance measurements for parallax corrections between images captured using the thermal image sensor and the non-thermal image sensor in the device attachment.
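Under the simplifying assumption that the residual misalignment between the two non-thermal modules is a pure translation, blocks 504 and 506 can be sketched as below. This is an illustrative reduction, not the application's method; a full calibration would also fit lens distortion and rotation.

```python
# Minimal sketch of blocks 504-506: the mean offset between matched edge
# points detected in the two non-thermal images (e.g., edges of the hand)
# is taken as the alignment correction and stored for later parallax
# measurements. Names and the pure-translation model are assumptions.

def estimate_translation(edges_a, edges_b):
    """Mean (dx, dy) between corresponding edge points.

    edges_a, edges_b: equal-length lists of (x, y) edge coordinates
    detected in the two images, already matched point-to-point.
    """
    n = len(edges_a)
    dx = sum(b[0] - a[0] for a, b in zip(edges_a, edges_b)) / n
    dy = sum(b[1] - a[1] for a, b in zip(edges_a, edges_b)) / n
    return dx, dy

# Block 506: store the correction for later distance measurements
calibration = {"alignment": estimate_translation(
    [(10, 10), (20, 30)], [(12, 11), (22, 31)])}
```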
While various embodiments illustrated above with reference to Figs. 5 and 6 were described in relation to utilizing a non-thermal image sensor in a user device (e.g., a phone camera) and a thermal image sensor and a non-thermal image sensor in a device attachment, it is also contemplated that the principles and spirit of the present disclosure may be applied to any other appropriate combination of image sensors in the user device and/or the device attachment. For example, where a thermal image sensor is additionally or alternatively provided in a user device, a non-thermal image sensor of the user device and a non-thermal image sensor of a device attachment for the user device may be utilized to provide a parallax correction between the thermal image sensor of the user device and the non-thermal image sensor of either the user device or the device attachment. In another example, for a user device that may include two or more non-thermal image sensors (e.g., for stereoscopic imaging or other purposes), the non-thermal image sensors of the user device may be utilized to provide parallax correction for image sensors of the device (and/or the device attachment if present).
In some embodiments, thermal imaging module 7000 may be used to determine an image-based calibrated temperature of an object (e.g., by capturing one or more calibrated thermal images and determining the temperature of the object from the intensity and/or spectrum of the object in the thermal images, as would be understood by one skilled in the art). The accuracy of this type of image-based temperature measurement can be improved by ensuring that the thermal imaging module has been recently calibrated when an image-based temperature measurement is to be made.
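As a rough illustration of an image-based temperature readout, the sketch below assumes the radiometric calibration reduces to a gain/offset mapping from sensor counts to scene temperature; real radiometric conversion is considerably more involved (emissivity, atmospheric terms, Planck-curve fitting), and the gain and offset values here are purely hypothetical.

```python
# Hypothetical image-based temperature estimate: a radiometric
# calibration is assumed to yield a linear counts-to-temperature
# mapping; the object temperature is then the average over the pixels
# belonging to the object. Gain/offset values are illustrative only.

def counts_to_celsius(counts, gain=0.01, offset=-50.0):
    """Map a calibrated thermal pixel value to a temperature estimate (C)."""
    return gain * counts + offset

def object_temperature(pixel_values):
    """Average the per-pixel temperature estimates over the object."""
    temps = [counts_to_celsius(c) for c in pixel_values]
    return sum(temps) / len(temps)
```

The accuracy of any such mapping drifts as the sensor warms, which is why the application gates these measurements on the time since the last shutter-based calibration, as described next.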
Fig. 7 is a flow chart of illustrative operations for ensuring that the thermal imaging module has been recently calibrated when an image-based temperature measurement is to be made. At block 600, a system such as a system that includes a device attachment having a thermal image sensor and an attached device may perform a calibration of a thermal image sensor such as a thermal image sensor in a device attachment using a closed shutter (e.g., by closing the shutter and capturing one or more images of the shutter using the thermal image sensor).
At block 602, the system may monitor the time since the last calibration of the thermal image sensor (e.g., by a processor in the device attachment or a processor in an attached device).
At block 604, the system may receive a request for an image-based temperature determination from a user.
At block 606, the system may determine whether the time since calibration is less than a maximum allowable time using the monitored time. The maximum allowable time may be, as examples, less than 20 seconds since the last calibration, less than 10 seconds since the last calibration, less than one minute since the last calibration, or less than 30 seconds since the last calibration. In response to determining that the time since the last calibration is less than the maximum allowable time, the system may proceed to block 608.
At block 608, one or more thermal images and/or an infrared spectrum of an object may be captured.
At block 610, the system may determine the temperature of the object from thermal images and/or the infrared spectrum.
If it is determined at block 606 that the time since the last calibration is greater than the maximum allowable time, the system may proceed to block 612.
At block 612, the system may instruct the user to perform a new calibration of the thermal imaging module using the closed shutter to ensure that the subsequent temperature measurement is accurate.
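The timer logic of blocks 600 through 612 can be sketched as a small state machine, assuming a monotonic clock and a 30-second maximum allowable time (one of the example thresholds above); class and method names are illustrative.

```python
# Sketch of the Fig. 7 timer logic: a closed-shutter calibration records
# a timestamp (block 600); a temperature request is allowed only while
# the calibration is fresher than the maximum allowable age (block 606),
# otherwise the user is directed to recalibrate (block 612).
import time

MAX_CALIBRATION_AGE_S = 30.0  # example threshold from the description

class ThermalCalibrationTimer:
    def __init__(self):
        self._last_calibration = None

    def record_calibration(self, now=None):
        """Call after a closed-shutter calibration completes (block 600)."""
        self._last_calibration = time.monotonic() if now is None else now

    def measurement_allowed(self, now=None):
        """Block 606: is the last calibration still fresh enough?"""
        if self._last_calibration is None:
            return False
        now = time.monotonic() if now is None else now
        return (now - self._last_calibration) < MAX_CALIBRATION_AGE_S

timer = ThermalCalibrationTimer()
timer.record_calibration(now=0.0)
timer.measurement_allowed(now=10.0)   # True -> proceed to block 608
timer.measurement_allowed(now=45.0)   # False -> block 612: recalibrate
```

A monotonic clock is used so that wall-clock adjustments on the device cannot make a stale calibration appear fresh.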
In some embodiments, a light source in a portable electronic device that is attached to a device attachment having a thermal imaging module may be used in cooperation with the thermal imaging module and a non-thermal imaging module to enhance imaging of a scene. For example, light source 103 of device 1200 (see FIG. 1) may be used to illuminate at least a portion of a scene in a spectrum sensed by one or more of imaging modules 7000, 7002, and/or 101. Light source 103 can be flashed or operated in a flashlight mode to illuminate some or all of the scene during image capture operations using imaging modules 7000 and 7002. Light source 103 may be turned on and/or flashed in response to user input or may be automatically turned on and/or flashed based on, for example, a light level determined using imaging module 7002, device camera 101, or another light sensor.
Fig. 8 illustrates a flow diagram of various operations to enhance imaging of a scene using thermal images and active illumination of the scene according to an embodiment.
At step 800, thermal image data may be captured using a thermal image sensor in a device attachment and non-thermal image data may be captured using a non-thermal image sensor in the device attachment. If desired, additional non-thermal image data may be captured using a camera in an attached device.
At step 802, while capturing the thermal image data and the non-thermal image data using the device attachment, a light source of an attached device may be operated. The light source may be operated (e.g., flashed or held on) during image capture operations based on, for example, user input and/or automatically determined light levels. Illuminating the scene using the light source may enhance the non-thermal images captured by the non-thermal image sensor in the device attachment.
At step 804, the captured thermal image data and the captured non-thermal image data from the device attachment may be combined to form an enhanced output image that includes some or all of the thermal image data and actively illuminated non-thermal image data. In some embodiments, thermal and non-thermal images may be processed to generate combined images using high contrast processing.
Regarding high contrast processing, high spatial frequency content may be obtained from one or more of the thermal and non-thermal images (e.g., by performing high pass filtering, difference imaging, and/or other techniques). A combined image may include a radiometric component of a thermal image and a blended component including infrared (e.g., thermal) characteristics of a scene blended with the high spatial frequency content, according to a blending parameter, which may be adjustable by a user and/or machine in some embodiments. In some embodiments, high spatial frequency content from non-thermal images may be blended with thermal images by superimposing the high spatial frequency content onto the thermal images, where the high spatial frequency content replaces or overwrites those portions of the thermal images corresponding to where the high spatial frequency content exists. For example, the high spatial frequency content may include edges of objects depicted in images of a scene, but may not exist within the interior of such objects. In such embodiments, blended image data may simply include the high spatial frequency content, which may subsequently be encoded into one or more components of combined images.
For example, a radiometric component of a thermal image may be a chrominance component of the thermal image, and the high spatial frequency content may be derived from the luminance and/or chrominance components of a non-thermal image. In this embodiment, a combined image may include the radiometric component (e.g., the chrominance component of the thermal image) encoded into a chrominance component of the combined image and the high spatial frequency content directly encoded (e.g., as blended image data but with no thermal image contribution) into a luminance component of the combined image. By doing so, a radiometric calibration of the radiometric component of the thermal image may be retained. In similar embodiments, blended image data may include the high spatial frequency content added to a luminance component of the thermal images, and the resulting blended data encoded into a luminance component of resulting combined images. The non-thermal image may be from any type of non-thermal imager, including for example a visible light imager, a low light visible light imager, a CCD imaging device, an EMCCD imaging device, a CMOS imaging device, an sCMOS imaging device, an NIR imaging device, an SWIR imaging device, or other types of non-thermal imagers (e.g., including passive or active illumination as would be understood by one skilled in the art).
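One variant of the blending described above can be sketched as follows. This is an illustrative reduction (pure Python, 1-D rows, a crude box-mean high-pass filter), not the application's implementation: high-frequency detail from the non-thermal luminance is added to the thermal luminance according to a blending parameter, while the thermal chrominance (the radiometric component) would be carried through unchanged.

```python
# Illustrative high-contrast blend: high spatial frequency content is
# extracted from the non-thermal (visible) luminance with a crude
# high-pass filter and added into the thermal luminance, scaled by a
# blending parameter. The thermal chrominance is kept separately so the
# radiometric calibration is retained.

def high_pass(row, kernel_radius=1):
    """Crude high-pass: each pixel minus its local mean."""
    out = []
    for i in range(len(row)):
        lo = max(0, i - kernel_radius)
        hi = min(len(row), i + kernel_radius + 1)
        local_mean = sum(row[lo:hi]) / (hi - lo)
        out.append(row[i] - local_mean)
    return out

def blend_luma(thermal_luma, visible_luma, zeta=0.5):
    """Add high-frequency visible detail into the thermal luminance.

    zeta: blending parameter (0 = thermal only, larger = more detail).
    """
    detail = high_pass(visible_luma)
    return [t + zeta * d for t, d in zip(thermal_luma, detail)]
```

Because the filter output is zero over flat regions, only edges and fine detail from the visible image are superimposed, matching the edge-superposition behavior described above.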
For example, any of the techniques disclosed in the following applications may be used in various embodiments: U.S. Patent Application No. 12/477,828 filed June 3, 2009; U.S. Patent Application No. 12/766,739 filed April 23, 2010; U.S. Patent Application No.
13/105,765 filed May 11, 2011; U.S. Patent Application No. 13/437,645 filed April 2, 2012; U.S. Provisional Patent Application No. 61/473,207 filed April 8, 2011; U.S. Provisional Patent Application No. 61/746,069 filed December 26, 2012; U.S. Provisional Patent Application No. 61/746,074 filed December 26, 2012; U.S. Provisional Patent Application No. 61/748,018 filed December 31, 2012; U.S. Provisional Patent Application No. 61/792,582 filed
March 15, 2013; U.S. Provisional Patent Application No. 61/793,952 filed March 15, 2013; and International Patent Application No. PCT/EP2011/056432 filed April 21, 2011, all of which are incorporated herein by reference in their entirety. Any of the techniques described herein, or described in other applications or patents referenced herein, may be applied to any of the various thermal devices, non-thermal devices, and uses described herein.
In some embodiments, any one of device attachment 1250 or device 1200 may be configured to receive user input indicating a portion-of-interest to be imaged by a first imaging module (e.g., infrared imaging module 7000), control the light source 103 to illuminate at least the portion-of-interest in a spectrum sensed by a second imaging module (e.g., visible spectrum imaging module 7002 and/or 101), receive illuminated captured images of the portion-of-interest from the second imaging module, and generate a combined image comprising illuminated characteristics of the scene derived from the illuminated captured images. In some embodiments, a thermal image may be used to detect a "hot" spot in an image, such as an image of a circuit breaker box. Light source 103 may be used to illuminate a label of a circuit breaker to provide a better image and potentially pinpoint the cause of the hot spot.
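The "hot spot" detection mentioned above can be sketched as a simple percentile threshold over the radiometric frame. This is an illustrative sketch only; the function name and the 99th-percentile cutoff are assumptions, not the claimed detection method.

```python
import numpy as np

def find_hot_spots(thermal, percentile=99.0):
    # Flag pixels whose radiometric value falls in the top
    # (100 - percentile)% of the frame -- candidates for targeted
    # illumination and closer inspection (e.g., a breaker label).
    threshold = np.percentile(thermal, percentile)
    ys, xs = np.nonzero(thermal >= threshold)
    return list(zip(ys.tolist(), xs.tolist()))
```

The returned coordinates could then be used to aim or gate the light source at the corresponding portion of the scene.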
Fig. 9 illustrates a flow diagram of various operations to enhance imaging of a scene based on user input in accordance with an embodiment of the disclosure. For example, one or more portions of process 5800 may be performed by device attachment 1250, device 1200 and/or each of imaging modules 7000 and/or 7002 and utilizing any of the components described and/or enumerated therein. It should be appreciated that any step, sub-step, sub-process, or block of process 5800 may be performed in an order or arrangement different from the embodiment illustrated by Fig. 9.
In some embodiments, any portion of process 5800 may be implemented in a loop so as to continuously operate on a series of infrared and/or visible spectrum images, such as a video of a scene. In other embodiments, process 5800 may be implemented in a partial feedback loop including display of intermediary processing (e.g., after or while receiving infrared and/or visible spectrum images, registering images to each other, generating illuminated and/or combined images, or performing other processing of process 5800) to a user, for example, and/or including receiving user input, such as user input directed to any intermediary processing step. Further, in some embodiments, process 5800 may include one or more steps, sub-steps, sub-processes, or blocks of any of the other processes described herein. At block 5810, device attachment 1250 generates visible spectrum images of a scene.
For example, imaging module 7002 may be configured to generate one or more visible spectrum images of a scene. In some embodiments, block 5810 may include one or more operations discussed with reference to the processes of Figs. 5-8. If desired, a device camera such as camera 101 may also capture visible spectrum images.
At block 5812, optionally at the same time as block 5810, device attachment 1250 generates infrared images of the scene. For example, imaging module 7000 may be configured to generate one or more infrared images of the scene. In some embodiments, block 5812 may include one or more operations discussed with reference to the processes of Figs. 5-8.
At block 5820, device attachment 1250 produces an output signal of data corresponding to the generated images. For example, any one of imaging modules 7000 or 7002 and/or a processor may be adapted to produce an output signal of data corresponding to the images generated in blocks 5810 and 5812. In some embodiments, the output signal may adhere to a particular interface standard, for example, such as MIPI®.
At block 5830, device attachment 1250 and/or device 1200 stores the data according to a common data format. For example, the data may be stored in a desired data file according to a common data format.
At block 5840, device attachment 1250 and/or device 1200 registers the images to each other. For example, device attachment 1250 and/or device 1200 may be adapted to register any one of the generated images to another one of the generated images by performing one or more of interpolation, scaling, cropping, rotational transformation, morphing, and/or filtering operations on one or more of the images to substantially match spatial content within the images. In some embodiments, device attachment 1250 and/or device 1200 may be adapted to register images to each other using one or more of the processes described in connection with Figs. 4-6.
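The scaling step of the registration described in block 5840 can be sketched with a nearest-neighbor rescale that matches the pixel grids of two differently sized sensors. This is a minimal illustrative stand-in for the interpolation/scaling operations mentioned above; the function name and nearest-neighbor choice are assumptions.

```python
import numpy as np

def scale_to(src, target_shape):
    # Nearest-neighbor rescaling: map each target pixel back to the
    # nearest source pixel so spatial content lines up between images
    # captured at different resolutions.
    h_t, w_t = target_shape
    h_s, w_s = src.shape
    rows = np.arange(h_t) * h_s // h_t
    cols = np.arange(w_t) * w_s // w_t
    return src[np.ix_(rows, cols)]
```

A full registration would add the cropping, rotation, and filtering steps named in the text; this sketch covers only the resolution-matching portion.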
At block 5850, device attachment 1250 and/or device 1200 receives user input indicating a portion-of-interest of the scene. For example, device attachment 1250 and/or device 1200 may be adapted to receive user input provided by one or more other components, a touchscreen display, and/or other devices indicating a portion-of-interest of the already imaged scene. The user input may be used to designate a pixel or group of pixels corresponding to the portion-of-interest. In some embodiments, the user input may be combined with the selection of registration operations performed in block 5840 to determine corresponding pixels in a variety of captured images.
At block 5852, a light source such as light source 103 of device 1200 illuminates the portion-of-interest. For example, any one of device attachment 1250 and/or device 1200 may be adapted to control light source 103 to illuminate all or a designated portion-of-interest in a particular scene. In some embodiments, a particular spectrum and/or portion of a scene may be selected by controlling a MEMS lens and/or other system coupled or otherwise associated with an illumination module.
At block 5854, device attachment 1250 and/or device 1200 generates illuminated images of the portion-of-interest. For example, any one of imaging modules 7000, 7002, or 101 sensitive to the spectrum illuminated in block 5852 may be adapted to generate an illuminated image that is captured while light source 103 is illuminating at least the portion-of-interest designated in block 5850.
At block 5860, device attachment 1250 and/or device 1200 generates combined images of the scene from the visible spectrum images, the infrared images, and/or the illuminated images. In one embodiment, a combined image may include a visible spectrum image with embedded data corresponding to infrared image data for each pixel of visible spectrum data. When such a combined image is displayed, a user may select a pixel or group of pixels with a user interface and text corresponding to the infrared image data may be displayed alongside the visible spectrum image, such as in a text box or legend, for example. In some embodiments, any one of imaging modules 7000, 7002, and/or 101 may be adapted to generate combined images using one or more of the processes described herein, including processes described in connection with Figs. 5-8.
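The per-pixel embedding described in block 5860 can be sketched as pairing each visible-spectrum pixel with its registered thermal datum, so a user-interface pixel selection can resolve to displayable text. This is an illustrative sketch; the dictionary layout, function names, and one-decimal text format are assumptions, not the claimed data format.

```python
import numpy as np

def embed_thermal(visible, thermal):
    # Store per-pixel thermal data alongside the visible image; the two
    # arrays must already be registered to the same pixel grid.
    assert visible.shape[:2] == thermal.shape
    return {"visible": visible, "thermal": thermal}

def selected_pixel_text(combined, y, x):
    # Text a UI might show in a text box or legend when the user selects
    # pixel (y, x) of the displayed combined image.
    return "%.1f" % float(combined["thermal"][y, x])
```

A group selection would aggregate over the selected pixels (e.g., report their mean or maximum) before formatting the text.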
At block 5870, device 1200 displays one or more of the generated images. For example, device 1200 may be adapted to use a display (e.g., display 201 in Fig. 2) to display one or more of the images generated in process 5800. In some embodiments, block 5870 may include one or more operations discussed with reference to processes of Figs. 4-6.
Embodiments described above illustrate but do not limit the invention. It should also be understood that numerous modifications and variations are possible in accordance with the principles of the invention. Accordingly, the scope of the invention is defined only by the following claims. While various embodiments illustrated herein are described in relation to a device attachment, it should be understood that one or more embodiments of the invention are also applicable to the device alone or in conjunction with the device attachment. For example, the thermal image sensor may be implemented directly into the device (e.g., device 1200), and optionally the additional non-thermal image sensor may also be implemented within the device. Consequently, the principles taught herein may be applied based on the sensors implemented within the device.
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.

Claims

What is claimed is:
1. A system comprising:
a device attachment having a thermal imaging module and a non-thermal imaging module; and
a processor configured to determine a parallax correction for the thermal imaging module and the non-thermal imaging module based on a first non-thermal image from the non-thermal imaging module and a second non-thermal image from a camera of a device attached to the device attachment.
2. The system of claim 1, wherein the processor is configured to:
determine a distance to an object using the first non-thermal image from the non-thermal imaging module and the second non-thermal image from the camera; and
determine the parallax correction for the thermal imaging module and the non-thermal imaging module based on the determined distance.
3. The system of claim 2, wherein the processor is configured to determine the distance based on a parallax-induced shift of the object in the first and second non-thermal images and based on a known distance between the non-thermal imaging module of the device attachment and the camera of the device.
4. The system of claim 3, wherein the processor comprises a processor of the device attachment.
5. The system of claim 3, further comprising the device attached to the device attachment.
6. The system of claim 5, wherein the processor comprises a processor of the device.
7. The system of claim 6, wherein the device comprises a mobile phone.
8. The system of claim 1, wherein the processor is configured to combine a thermal image from the thermal imaging module with the first non-thermal image using the determined parallax correction.
9. A method, comprising:
capturing a first non-thermal image using a camera in a portable electronic device;
capturing a second non-thermal image using a non-thermal imaging module in a device attachment that is attached to the portable electronic device;
determining a distance to an object using the first and second non-thermal images;
capturing a thermal image using a thermal imaging module in the device attachment;
performing a parallax correction on at least one of the thermal image or the first non-thermal image based on the determining; and
combining, following the performing, the thermal image and the second non-thermal image to form an enhanced output image.
10. The method of claim 9, further comprising providing power from a battery in the device attachment to the portable electronic device.
11. The method of claim 9, further comprising:
operating a light source of the portable electronic device; and
synchronizing the capturing of the first and second non-thermal images based on the operating.
12. The method of claim 9, further comprising:
monitoring a movement of the portable electronic device and the device attachment; and
capturing the first and second non-thermal images when the movement is below a desired threshold.
13. The method of claim 9, further comprising:
capturing additional images of an additional object using the camera, the non-thermal imaging module, and the thermal imaging module;
detecting edges of the additional object in each of the additional images; and
determining alignment and distortion corrections between the camera and the non-thermal imaging module for use in the performing of the parallax correction.
14. A method, comprising:
performing a calibration operation for a thermal imaging module in a device attachment that is attached to a user device;
receiving an image-based temperature determination request from a user;
determining whether a time since the performing is less than a maximum allowed time; and
capturing a thermal image using the thermal imaging module if it is determined that the time since the performing is less than the maximum allowed time.
15. The method of claim 14, wherein the performing comprises capturing one or more images of a closed shutter of the device attachment using the thermal imaging module.
16. The method of claim 15, further comprising: if it is determined that the time since the performing is greater than the maximum allowed time, providing instructions to the user to close the shutter to perform a new calibration of the thermal imaging module.
17. A system comprising:
a device attachment comprising:
a housing;
an infrared sensor assembly within the housing, the infrared sensor assembly configured to capture infrared image data;
a non-thermal camera module within the housing, the non-thermal camera module configured to capture non-thermal image data; and
a processing module communicatively coupled to the infrared sensor assembly and the non-thermal camera module; and
a user device releasably attached to the housing of the device attachment, the user device comprising:
a non-thermal camera module; and
a light source,
wherein the processing module is configured to cooperate with the infrared sensor assembly, the non-thermal camera module in the device attachment and at least one of the non-thermal camera module in the user device or the light source to capture and process images.
18. The system of claim 17, wherein the processing module is configured to operate the light source while the infrared image data and non-thermal image data are captured to cooperate with the infrared sensor assembly, the non-thermal camera module in the device attachment and the light source to capture and process the images.
19. The system of claim 18, wherein the processing module is configured to combine the infrared image data and the non-thermal image data captured while the light source was operated to form an enhanced output image.
20. The system of claim 17, wherein the processing module is configured to:
receive the infrared image data, the non-thermal image data, and additional non-thermal image data from the non-thermal camera module in the user device; and
combine the infrared image data and the non-thermal image data based on a correction determined using the non-thermal image data and the additional non-thermal image data to cooperate with the infrared sensor assembly, the non-thermal camera module in the device attachment and the non-thermal camera module in the user device to capture and process the images.
PCT/US2014/073096 2013-12-31 2014-12-31 Device attachment with dual band imaging sensor WO2015103446A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201480076762.8A CN106068446B (en) 2013-12-31 2014-12-31 Equipment appurtenance with dual-waveband imaging sensor
KR1020167021120A KR102418369B1 (en) 2013-12-31 2014-12-31 Device attachment with dual band imaging sensor
US15/199,867 US11297264B2 (en) 2014-01-05 2016-06-30 Device attachment with dual band imaging sensor

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
USPCT/US2013/078551 2013-12-31
PCT/US2013/078551 WO2014106276A2 (en) 2012-12-31 2013-12-31 Infrared imaging device having a shutter
US201461923732P 2014-01-05 2014-01-05
US61/923,732 2014-01-05
US14/246,006 2014-04-04
US14/246,006 US9674458B2 (en) 2009-06-03 2014-04-04 Smart surveillance camera systems and methods
US14/281,883 US9900478B2 (en) 2003-09-04 2014-05-19 Device attachment with infrared imaging sensor
US14/281,883 2014-05-19
US14/299,987 2014-06-09
US14/299,987 US9083897B2 (en) 2009-06-03 2014-06-09 Infrared camera systems and methods for dual sensor applications
PCT/US2014/059200 WO2015051344A1 (en) 2013-10-03 2014-10-03 Durable compact multisensor observation devices
USPCT/US2014/059200 2014-10-03

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/199,867 Continuation US11297264B2 (en) 2014-01-05 2016-06-30 Device attachment with dual band imaging sensor

Publications (2)

Publication Number Publication Date
WO2015103446A2 true WO2015103446A2 (en) 2015-07-09
WO2015103446A3 WO2015103446A3 (en) 2015-08-27

Family

ID=53494229

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/073096 WO2015103446A2 (en) 2013-12-31 2014-12-31 Device attachment with dual band imaging sensor

Country Status (3)

Country Link
KR (1) KR102418369B1 (en)
CN (1) CN106068446B (en)
WO (1) WO2015103446A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160262631A1 (en) * 2015-03-12 2016-09-15 Ein-Yiao Shen Handset mobile communication device for measuring body temperature and body temprature measuring method thereof

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102525937B1 (en) * 2018-03-20 2023-04-28 삼성전자주식회사 The electronic device comprising a pluraliaty of light sources
CN111193821B (en) * 2020-03-03 2021-04-13 覃立 Mobile phone with infrared imaging temperature measurement function and temperature measurement method thereof

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6633231B1 (en) * 1999-06-07 2003-10-14 Horiba, Ltd. Communication device and auxiliary device for communication
JP2005338359A (en) * 2004-05-26 2005-12-08 Constec Engi Co Imaging unit
US7820967B2 (en) * 2007-09-11 2010-10-26 Electrophysics Corp. Infrared camera for locating a target using at least one shaped light source
JP2010117587A (en) * 2008-11-13 2010-05-27 Ftc:Kk Camera system for construction work, and stroboscopic device for cellular phone with camera
US9843743B2 (en) * 2009-06-03 2017-12-12 Flir Systems, Inc. Infant monitoring systems and methods using thermal imaging
US9235023B2 (en) * 2011-06-10 2016-01-12 Flir Systems, Inc. Variable lens sleeve spacer
JP2013235532A (en) * 2012-05-11 2013-11-21 Azone Co Ltd Terminal adapter
CN205449295U (en) * 2012-12-26 2016-08-10 菲力尔系统公司 Device annex
WO2014159758A1 (en) * 2013-03-14 2014-10-02 Drs Rsta, Inc. System architecture for thermal imaging and thermography cameras


Also Published As

Publication number Publication date
WO2015103446A3 (en) 2015-08-27
KR102418369B1 (en) 2022-07-08
KR20160105513A (en) 2016-09-06
CN106068446A (en) 2016-11-02
CN106068446B (en) 2019-10-25


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14828430; Country of ref document: EP; Kind code of ref document: A2)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20167021120; Country of ref document: KR; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 14828430; Country of ref document: EP; Kind code of ref document: A2)