WO2014172033A1 - Estimating bilirubin levels - Google Patents

Estimating bilirubin levels

Info

Publication number
WO2014172033A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
image data
region
skin
bilirubin
Application number
PCT/US2014/024761
Other languages
French (fr)
Inventor
James A. Taylor
Shwetak N. Patel
James W. STOUT
Lilian DE GREEF
Mayank Goel
Eric C. Larson
Original Assignee
University Of Washington Through Its Center For Commercialization
Application filed by University Of Washington Through Its Center For Commercialization
Priority to EP14784645.5A priority Critical patent/EP2967359A4/en
Priority to JP2016501634A priority patent/JP6545658B2/en
Priority to KR1020157028278A priority patent/KR102237583B1/en
Publication of WO2014172033A1 publication Critical patent/WO2014172033A1/en
Priority to US14/835,348 priority patent/US10285624B2/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898: Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B 5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/1032: Determining colour for diagnostic purposes
    • A61B 5/1034: Determining colour for diagnostic purposes by means of colour cards
    • A61B 5/44: Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B 5/441: Skin evaluation, e.g. for skin disorder diagnosis
    • A61B 5/443: Evaluating skin constituents, e.g. elastin, melanin, water
    • A61B 2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/04: Babies, e.g. for SIDS detection
    • A61B 2576/00: Medical imaging apparatus involving image processing or analysis
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • a mobile device is used to capture image data of a patient's skin and a color calibration target.
  • the image data is processed to generate an estimation of the bilirubin level.
  • the image processing can include transforming the image data into a plurality of different color spaces to facilitate assessment of the overall yellowness of the skin while compensating for color differences caused by lighting, skin tone, and other potentially confounding factors.
  • a method for estimating the level of bilirubin in a patient.
  • the method includes receiving image data for at least one image including a region of the patient's skin and a color calibration target. Color-balanced image data for the skin region is generated based on a subset of the image data corresponding to the color calibration target and the skin region.
  • the bilirubin level in the patient is estimated based on the color-balanced image data for the skin region.
  • the image data can be obtained with any suitable imaging device.
  • the imaging device can collect image data independently of additional attachments or equipment, such as external lenses, filters, or other specialized hardware.
  • the bilirubin level can be estimated using only image data of the patient's skin color at a particular point in time, or by comparing the skin color image data to baseline skin color image data.
  • the method can further include receiving baseline skin color data for the patient corresponding to when the patient has a reference bilirubin level (e.g., approximately zero when baseline data is obtained within 24 hours of birth).
  • the bilirubin level can be estimated based on one or more differences between the baseline skin color data and the color-balanced image data for the skin region.
  • the baseline skin color data for the patient can be generated by capturing baseline image data for the patient when the patient has the reference bilirubin level.
  • the baseline image data can correspond to at least one image including the skin region and a baseline color calibration target.
  • Color-balanced baseline image data for the skin region can be generated based on a subset of the baseline image data corresponding to the baseline color calibration target and the skin region.
  • the baseline skin color data can be generated based on the color-balanced baseline image data for the skin region.
  • a standardized color calibration target can be used to facilitate the color balancing process.
  • the color calibration target can include a plurality of standardized color regions, including a white color region.
  • the standardized color regions can include a black region, a gray region, a light brown region, a cyan region, a magenta region, a yellow region, and a dark brown region.
  • the color calibration target can at least partially define an opening configured to expose the skin region to permit capturing of image data for the skin region.
  • the standardized color regions can be disposed in a known arrangement surrounding the opening.
  • the process for generating color-balanced image data for the skin region can include processing the received image data to identify a subset of the image data corresponding to the exposed skin region and a subset of the image data corresponding to the white color region.
  • the white color region data can be processed to determine observed color values for the white color region.
  • Color-balanced image data for the exposed skin region can be generated based on the observed color values for the white color region.
  • the observed color values for the white color region can include any suitable color space values, such as red, green, blue (RGB) color space values.
  • the image data can be transformed into a plurality of different color spaces in order to detect yellow discoloration of the skin.
  • the color-balanced image data for the skin region can include RGB color space data
  • a method of estimating the level of bilirubin in a patient can further include transforming the RGB color space data into at least one other color space to generate color-balanced image data for the exposed skin region for the at least one other color space.
  • the at least one color space can include: (a) a cyan, magenta, yellow, and black (CMYK) color space; (b) a YCbCr color space; and/or (c) a Lab color space.
  • a plurality of chromatic and achromatic features can be generated based on the image data.
  • the received image data can include an image obtained using flash illumination and an image obtained without using flash illumination.
  • Estimating the bilirubin level can include processing a plurality of normalized chromatic and achromatic features to select a first estimated range of the bilirubin level from one of a plurality of different bilirubin ranges.
  • the features can be processed using an approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level.
  • the plurality of different bilirubin ranges can include a low range, a medium range, and a high range.
  • the plurality of features can include selected color values of the skin region for a plurality of different color spaces. In some instances, the plurality of features can include a calculation of a color gradient across the skin region.
  • Estimation of the bilirubin level can involve performing one or more regressions.
  • processing the features to select a first estimated range of the bilirubin level can include performing a series of regressions, including at least one of: (a) a linear regression, (b) an encapsulated k-Nearest Neighbor regression, (c) a lasso regression, (d) a LARS regression, (e) an elastic net regression, (f) a support vector regression using a linear kernel, (g) a support vector regression assigning higher weight to higher-rated bilirubin values, and (h) a random forest regression.
  • Processing the features to select a first estimated range of the bilirubin levels can include performing a series of regressions, including: (a) a linear regression, (b) an encapsulated k-Nearest Neighbor regression, (c) a lasso regression, (d) a LARS regression, (e) an elastic net regression, (f) a support vector regression using a linear kernel, (g) a support vector regression assigning higher weight to higher-rated bilirubin values, and (h) a random forest regression.
  • Using a processing approach based on the selected first estimated range of the bilirubin level can include performing a final random forest regression that uses the plurality of normalized chromatic and achromatic features and the selected first estimated range of the bilirubin level as the features for the final random forest regression.
  • estimating the bilirubin level includes determining a color space value for the patient's skin and using a processing approach based on the determined color space value to estimate the bilirubin level.
  • the regression equations can also include features from the baseline image (e.g., color space values from one or more color spaces).
  • a mobile device configured to estimate the level of bilirubin in a patient.
  • the device includes a camera operable to capture image data for a field of view, a processor operatively coupled with the camera, and a data storage device operatively coupled with the processor.
  • the data storage device can store instructions that, when executed by the processor, cause the processor to receive image data for an image captured by the camera, the image including a region of the patient's skin and a color calibration target.
  • the instructions can cause the processor to generate color-balanced image data for the skin region based on a subset of the image data corresponding to the color calibration target and the skin region, and estimate the bilirubin level in the patient based on the color-balanced image data for the skin region.
  • the mobile device can be used to estimate the bilirubin level independently of any external attachments to the mobile device (e.g., lenses, filters) or any other specialized mobile device equipment.
  • the color calibration target can at least partially define an opening configured to expose the skin region to permit capturing of image data for the skin region, and can include a plurality of standardized color regions including a white color region.
  • the instructions can cause the processor to process the received image data to identify a subset of the image data corresponding to the exposed skin region and a subset of the image data corresponding to the white color region.
  • the white color region data can be processed to determine observed color values for the white color region.
  • Color-balanced RGB image data can be generated for the exposed skin region based on the observed color values for the white color region.
  • Color-balanced image data for the exposed skin region can be generated for at least one other color space by transforming the color-balanced RGB image data into the at least one other color space.
  • a plurality of normalized chromatic and achromatic features can be processed to select a first estimated range of the bilirubin level from one of a plurality of different bilirubin ranges.
  • the features can be processed using an approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level.
  • the color-balanced image data for the exposed skin region can be processed to determine a color space value for the patient's skin.
  • a plurality of normalized chromatic and achromatic features can be processed using an approach based on the determined patient's skin color space value to estimate the bilirubin level.
  • the mobile devices described herein can further include a flash unit operable to selectively illuminate the field of view.
  • the received image data processed to estimate the bilirubin level can include an image captured with the field of view being illuminated by the flash unit and an image captured with the field of view not being illuminated by the flash unit.
  • a method for estimating the level of bilirubin in a patient includes receiving, from a mobile device, image data for an image including a skin region of a patient and a color calibration target. Color-balanced image data for the skin region can be generated based on a subset of the image data corresponding to the color calibration target and the skin region, via one or more processors. The bilirubin level in the patient can be estimated based on the color-balanced image data for the skin region, via the one or more processors. The estimated bilirubin level can be transmitted to the mobile device.
  • At least one of receiving the image data and transmitting the estimated bilirubin level can be performed using short message service (SMS) text messaging.
  • the image data can be obtained by the mobile device without the use of external attachments, hardware add-ons, or any other specialized equipment.
  • FIG. 1A illustrates guidelines for the use of phototherapy to treat neonatal jaundice, in accordance with embodiments
  • FIG. 1B illustrates guidelines for the use of exchange transfusion to treat neonatal jaundice, in accordance with embodiments
  • FIG. 1C illustrates a Bhutani nomogram for assessing risk associated with neonatal jaundice
  • FIG. 2 illustrates the use of a mobile device to capture image data used to estimate the bilirubin level in a patient, in accordance with many embodiments;
  • FIG. 3A through FIG. 3D illustrate color calibration targets used in conjunction with a mobile device to capture image data used to estimate bilirubin level, in accordance with many embodiments;
  • FIG. 4A through FIG. 4C illustrate exemplary user interfaces of a mobile application for estimating bilirubin level, in accordance with many embodiments
  • FIG. 5A through FIG. 5F illustrate exemplary image data collected for use in estimating bilirubin level, in accordance with many embodiments
  • FIG. 6 illustrates a method for estimating the bilirubin level in a patient, in accordance with many embodiments
  • FIG. 7 illustrates a method for generating color-balanced image data for a patient's skin, in accordance with many embodiments
  • FIG. 8 illustrates identification of a standardized color region in an image, in accordance with many embodiments
  • FIG. 9A illustrates a method for estimating the bilirubin level in a patient, in accordance with many embodiments
  • FIG. 9B illustrates another example of a method for estimating the bilirubin level in a patient, in accordance with many embodiments
  • FIG. 10 illustrates a method for estimating the bilirubin level in a patient, in accordance with many embodiments
  • FIG. 11A illustrates a method for estimating the bilirubin level using baseline skin color data, in accordance with many embodiments
  • FIG. 11B illustrates a method for generating baseline skin color data, in accordance with many embodiments
  • FIG. 12 illustrates a mobile device for estimating bilirubin level, in accordance with many embodiments.
  • FIG. 13 illustrates a mobile device in a communication with a data processing system for estimating bilirubin level, in accordance with many embodiments.
  • a mobile device configured with suitable software can be used to capture images of a patient's skin and a standardized color calibration target.
  • the image data of the skin and target can be used to generate color-balanced image data that is analyzed to estimate the bilirubin level in the patient.
  • the color-balanced image data can be transformed into a plurality of different color spaces in order to extract features representative of the yellowness of the skin, and these features can be used in a plurality of regressions to generate the bilirubin estimate.
  • the systems, devices, and methods described herein enable convenient, portable, and inexpensive bilirubin level estimation that can easily be performed by non-medical personnel using a personal mobile device, thereby improving the accessibility and cost-effectiveness of bilirubin monitoring.
  • the methods described herein can be performed on a mobile device without requiring the use of external attachments, hardware add-ons, or any other specialized mobile device equipment.
  • the disclosed techniques provide accurate bilirubin level estimation over a large range of bilirubin concentrations, in contrast to transcutaneous bilirubin (TcB) measurement, which exhibits reduced accuracy at high bilirubin levels.
  • the methods described herein account for diversity in skin tones as well as different lighting conditions, thereby improving the accuracy and flexibility of non-invasive bilirubin level estimation.
  • FIG. 1A illustrates guidelines for the use of phototherapy to treat neonatal jaundice.
  • FIG. 1B illustrates guidelines for the use of exchange transfusion to treat neonatal jaundice.
  • the guidelines can be used by a medical professional to determine the appropriate course of treatment, based on the infant's age, total serum bilirubin (TSB), number of weeks of gestation (e.g., > 35 weeks), and other risk factors.
  • Risk factors may include isoimmune hemolytic disease, G6PD deficiency, asphyxia, significant lethargy, temperature instability, sepsis, acidosis, or an albumin level lower than 3.0 grams per deciliter.
  • the curves depicted in FIGS. 1A and 1B indicate exemplary treatment thresholds for low risk, medium risk, and high risk infants.
  • FIG. 1C illustrates a Bhutani nomogram 110 for assessing risk associated with neonatal jaundice.
  • the Bhutani nomogram 110 can be used by a medical professional to assess an infant's risk for developing hyperbilirubinemia based on the infant's postnatal age and bilirubin level.
  • the Bhutani nomogram 110 can include a plurality of percentile curves 112 used to define a low risk zone 114, a low intermediate risk zone 116, a high intermediate risk zone 118, and a high risk zone 120.
  • FIG. 2 illustrates mobile device-based estimation of bilirubin levels, in accordance with many embodiments.
  • An imaging device such as a mobile device 202 (e.g., a smartphone, tablet) is used to capture an image of a skin region 204 of a patient 206 and a color calibration target 208.
  • Suitable skin regions for the approaches described herein include the forehead and sternum, as well as any other prominent, flat regions of skin that are likely to be evenly lit. Additionally, since jaundice typically first appears on the forehead and slowly progresses downward on the body, regions closer to the forehead can potentially be more informative for diagnosis.
  • the color calibration target 208 is on the abdomen of the patient 206 near the sternum.
  • the mobile device 202 includes a camera (not shown) for recording image data, as well as a display 210 used for presenting a user interface (UI) of a mobile software application ("mobile app” or "app") for estimating bilirubin levels based on the image data.
  • the mobile device 202 can be a personal device of a user (e.g., a parent, medical professional, community health worker, etc.), such that the user need only install the mobile app on their personal device in order to perform the methods described herein, with no other instrumentation or hardware being needed besides the color calibration target 208.
  • the mobile device 202 can be used to practice the methods described herein independently of any further attachment or accessory to the mobile device, such as an external lens, filter, or other specialized mobile device equipment. Additional details on suitable software and hardware components for the mobile device 202 are provided in further detail below.
  • FIG. 3A through FIG. 3D illustrate color calibration targets that can be used in conjunction with a mobile device for estimating bilirubin levels, in accordance with many embodiments.
  • the color calibration targets described herein, also known as "color calibration cards" or "color cards," can be used to account for differences in lighting or other potentially confounding factors.
  • a color calibration target 300 can be provided on a rectangular card 302.
  • the card 302 can be cardstock of any size suitable for placement onto the skin of a patient (e.g., an infant), such as approximately the size of a business card. In some instances, the card 302 can be sterilizable or disposable, so as to prevent the spread of pathogens between patients.
  • the color calibration target 300 can include a plurality of standardized colored regions 304, which can be printed onto the card 302.
  • the colored regions 304 can be of any suitable size, number, or shape (e.g., square, rectangular, polygonal, circular, elliptical, etc.).
  • the color calibration target 300 is depicted as including eight identically-sized square colored regions 304a-h.
  • Each of the standardized colored regions 304 can be of a different color (e.g., black, gray, white, cyan, magenta, yellow, light brown, dark brown) and be positioned in a known arrangement on the card 302.
  • 304a is a black region
  • 304b is a gray region (e.g., 50% gray)
  • 304c is a white region
  • 304d is a light brown region (e.g., a first skin tone)
  • 304e is a cyan region
  • 304f is a magenta region
  • 304g is a yellow region
  • 304h is a dark brown region (e.g., a second skin tone).
  • the back side (not shown) of the card 302 can include one or more adhesive regions enabling the card 302 to be removably attached to the patient's skin.
  • the back side can also include relevant instructions, such as instructions for the user on how to download the accompanying mobile app onto their personal mobile device.
  • FIG. 3B illustrates an alternative embodiment of a color calibration target 320.
  • the color calibration target 320 is substantially similar to the color calibration target 300, except it also defines an opening 322.
  • the opening 322 can be positioned such that when the color calibration target 320 is placed on the patient, the skin region of interest is exposed through the opening 322.
  • the opening 322 can be entirely defined (e.g., completely surrounded) or partially defined (e.g., partially surrounded) by the target 320.
  • a plurality of standardized color regions 324 including a white color region, can be positioned around the opening 322 in a known arrangement.
  • the embodiment of FIG. 3B further includes light gray, medium gray, and dark gray color regions for a total of 10 different standardized color regions.
  • FIG. 3C illustrates a color calibration target 340 provided on a square card 342, with a plurality of standardized color regions 344
  • FIG. 3D illustrates a color calibration target 360 having a plurality of standardized color regions 362 surrounding an opening 364.
  • the opening 364 can be adjacent to some or all of color regions 362.
  • the color regions 362 are depicted in FIG. 3D as a series of rectangular regions with differing aspect ratios.
  • the colors, geometry, and arrangement of the color regions of the color calibration targets described herein can be selected based on the image processing and analysis methods to be used, such as the embodiments discussed below.
  • FIG. 4A through FIG. 4C illustrate exemplary user interfaces (UIs) of a mobile app for estimating bilirubin level, in accordance with many embodiments.
  • FIG. 4A illustrates a summary UI 400 for displaying various statistics and metrics relating to a patient.
  • the UI 400 can include patient identification information 402 (e.g., a patient name, study ID number), time of birth 404, and any available bilirubin level results 406 (e.g., TSB and/or TcB results).
  • the UI 400 can also include a button 408 or other interactive elements enabling the user to collect image data (e.g., a video sample or photographic sample) of a patient. In some instances, video samples can be advantageous for eliminating issues of motion blur.
  • FIG. 4B illustrates an instruction UI 420 for assisting a user with capturing image data of a patient.
  • the UI 420 can include graphical and/or textual instructions 422 directing the user to perform the appropriate steps.
  • the instructions 422 can instruct the user to place a color calibration target onto the abdomen of the patient below the sternum.
  • the instructions 422 are adapted to be easily comprehended by individuals without medical training, thereby allowing non-medical professionals to operate the mobile app.
  • FIG. 4C illustrates a live preview UI 440 displayed to a user during the image capture process.
  • the UI 440 can include a video preview 442 of the current field of view of the camera of the mobile device.
  • the UI 440 can include one or more positioning targets 444 to assist the user in positioning the mobile device.
  • the positioning target 444 can be a box, frame, or colored region overlaid onto the video preview 442 to indicate the appropriate placement of the color calibration target.
  • the size and location of the positioning target 444 can be selected to constrain the distance of the mobile device from the patient to within an optimal range, and also to ensure that the field of view adequately captures the appropriate skin regions and the color calibration target.
  • the user can record image data by tapping a record button 446. Additional details regarding the image collection and analysis process are provided below.
  • FIG. 5A through FIG. 5F illustrate exemplary image data collected by the mobile app, in accordance with many embodiments.
  • the mobile app can display the image data to the user for review and approval.
  • the mobile app can include functionality for assessing image quality and alerting the user to images that do not meet a quality threshold and therefore necessitate a retake.
  • FIG. 5A illustrates an image 500 of sufficient quality, in which the lighting is satisfactory and the patient and color calibration target are both clearly visible.
  • FIG. 5B illustrates a deficient image 502, in which portions of the image are obstructed by glare.
  • FIG. 5C illustrates a deficient image 504, in which the image is too bright.
  • FIG. 5D illustrates a deficient image 506, in which the color calibration target is partially obstructed.
  • FIG. 5E illustrates a deficient image 508, in which the image is partially obscured by shadows.
  • FIG. 5F illustrates a deficient image 510, in which the image is too dark.
  • FIG. 6 illustrates a method 600 for estimating the bilirubin level in a patient, in accordance with many embodiments.
  • the method 600 can be practiced using any of the systems and devices disclosed herein, such as a mobile device or a separate computing system (e.g., a remote server) in communication with a mobile device.
  • Certain steps of the method 600, as with all other methods described herein, can be optional, or may be combined with or substituted for suitable steps of other methods disclosed herein.
  • image data is received for at least one image including a region of the patient's skin and a color calibration target.
  • the image data can be collected using the camera of a mobile device, as previously described herein.
  • Image data can include photographic data, video data, or suitable combinations thereof.
  • Photographic data and video data can be captured sequentially or simultaneously (e.g., images are taken during video recording).
  • the image data can be obtained with and/or without using flash illumination.
  • flash illumination can be used to cancel out environmental lighting, such that the lighting in the resultant images is determined solely by contributions from the flash illumination. This can be advantageous to produce more consistent lighting or in situations where the environmental lighting is suboptimal (e.g., too dark, strongly colored, etc.).
  • the image data is collected using a mobile app that controls the mobile device's flash unit, as well as the sequence and number of images taken.
  • the app can turn on the flash unit during the initial positioning of the mobile device, so that the user can assess the amount of glare in the image (e.g., via the video preview) and reposition the device as necessary to reduce or eliminate glare.
  • the app can control the mobile device to first obtain image data with the flash unit on and then obtain image data with the flash unit off, thereby generating two image sets with and without flash
  • the app can be configured to record a video of the patient and calibration target, with the flash unit on for the first half and off for the second half.
  • the app can also capture two photographs taken during the first and second halves of the video recording, respectively.
  • the overall length of the video can be any suitable time, such as approximately 10 seconds.
  • the timing of the image capture process can further be configured to ensure that the image sensor (e.g., charge-coupled device (CCD) array) of the mobile device has stabilized before the next image is taken.
  • the app can include a specified amount of delay time (e.g., three to four seconds) before recording each image set.
  • the mobile app can analyze the image data to determine whether it is of sufficient quality for subsequent use. For instance, the app can implement suitable image analysis techniques (e.g., computer vision) to assess image quality.
  • An exemplary procedure involves extracting the color calibration target from the captured image data, checking the color consistency across each color region of the calibration target (e.g., determine whether the standard deviation of pixel values for each color region falls below a predetermined threshold), and recommending that the user retake any images that do not pass this test.
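  • The uniformity test described above can be sketched in a few lines. The snippet below is a minimal illustration only, assuming the calibration-target color regions have already been segmented into per-region pixel arrays; the function name and threshold value are hypothetical.

```python
import numpy as np

def passes_quality_check(color_regions, max_std=12.0):
    """Return True if every segmented color region of the calibration target
    is sufficiently uniform (low per-channel standard deviation).

    color_regions: dict mapping region name -> (N, 3) array of RGB pixels.
    max_std: hypothetical threshold; glare, shadows, or obstructions raise
             the per-channel deviation and should trigger a retake.
    """
    for name, pixels in color_regions.items():
        # Standard deviation of each RGB channel across the region.
        channel_std = pixels.reshape(-1, 3).std(axis=0)
        if np.any(channel_std > max_std):
            return False  # region is blotchy -> recommend retaking the image
    return True
```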
  • the app can be configured to capture multiple sets of image data per session, so as to maximize the likelihood that at least some of the image data will meet the quality standards.
  • the approved images can then be processed on the mobile device or transmitted to a separate computing system for processing, as described in further detail below.
  • color-balanced image data for the skin region is generated based on a subset of the image data corresponding to the color calibration target and the skin region.
  • Color balancing can be performed in order to compensate for different lighting conditions, as the color content of the image data can vary considerably based on the environmental lighting (e.g., intensity, color, type (halogen, fluorescent, natural, etc.)).
  • the image data is initially normalized by dividing the three red, green, blue (RGB) color channels by the overall luminescence of the image.
  • the observed pixel color values for one or more of the standardized color regions of the color calibration target can be used to determine the color balancing adjustments to be applied to the skin regions in the image data, as discussed below.
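  • As a concrete illustration of the normalization step, the sketch below divides each RGB channel by the image's overall luminance. The Rec. 601 luma weights are an assumption made for illustration; the patent does not specify the exact luminance formula.

```python
import numpy as np

def normalize_by_luminance(rgb):
    """Normalize an RGB image (H, W, 3 floats) by its overall luminance.

    The luma weights (Rec. 601) are an illustrative assumption; any
    consistent luminance estimate could be substituted.
    """
    rgb = rgb.astype(np.float64)
    luminance = (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]).mean()
    return rgb / max(luminance, 1e-6)
```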
  • the bilirubin level in the patient is estimated based on the color-balanced image data for the skin region. Since hyperbilirubinemia produces a yellow discoloration of the skin, the bilirubin level can be determined based on the amount of yellow in the color-balanced skin region image data. This determination can be performed using any suitable technique, such as by transforming the color-balanced image data into a plurality of different color spaces selected to facilitate quantifying the overall yellowness of the skin. Image features generated from the transformed image data can be input into a series of machine learning regressions designed to estimate the concentration of bilirubin in the patient's body. These approaches are discussed in further detail below.
  • FIG. 7 illustrates a method 700 for generating color-balanced image data, in accordance with many embodiments.
  • the method 700 can be applied to photographic and/or video data obtained of a patient's skin region and color calibration target, as previously discussed.
  • the received image data is processed to identify the image data corresponding to the exposed skin region and the image data corresponding to the white color region.
  • the image data can be segmented to extract the pixel values of the color regions of the color calibration target and the skin regions of interest (e.g., sternum, forehead).
  • when the mobile app UI includes a positioning target for aligning the calibration target (e.g., positioning target 444 of FIG. 4C), the approximate pixel coordinates of the color calibration target in the image data are already known, and the search space for the color regions can be constrained accordingly.
  • the positioning of the color regions can be determined by identifying at least two color regions on the calibration target.
  • FIG. 8 illustrates identification of a standardized color region in an image, in accordance with many embodiments.
  • the color region segmentation process can be implemented via a suitable algorithm configured to apply predetermined thresholds to the image.
  • since the cyan, magenta, and yellow color regions have distinct hues and relatively high saturation, the algorithm initially attempts to identify at least two of these regions.
  • the algorithm can convert the image data to a hue, saturation, value (HSV) color space and apply empirically determined thresholds to the hue and saturation channels, thereby obtaining a thresholded hue image 800 and thresholded saturation image 802.
  • An "AND" operation can be performed on the two thresholded images to separate the specified color region from the rest of the image, thereby producing a segmented image 804.
  • this information can be used to differentiate the color region from the noise in the image data (e.g., using edge detection, morphological operations, etc.).
  • the algorithm can then use contour detection to identify the boundary of the color region, with contour smoothing performed using suitable techniques such as the Douglas-Peucker algorithm. Since the overall arrangement of the color regions is known, once the positions of two color regions are determined, the overall orientation of the calibration target can be calculated and used to extrapolate the positions of the other regions, including the white color region.
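  • A minimal OpenCV sketch of this segmentation step follows; the hue and saturation bounds are placeholder values standing in for the empirically determined thresholds mentioned above, and the function name is hypothetical.

```python
import cv2

def segment_color_region(bgr_image, hue_range=(80, 100), sat_range=(120, 255)):
    """Isolate one high-saturation calibration patch (e.g., cyan) by
    thresholding hue and saturation, AND-ing the two masks, and tracing the
    largest contour. Threshold ranges are illustrative placeholders."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hue, sat = hsv[..., 0], hsv[..., 1]

    hue_mask = cv2.inRange(hue, hue_range[0], hue_range[1])
    sat_mask = cv2.inRange(sat, sat_range[0], sat_range[1])
    mask = cv2.bitwise_and(hue_mask, sat_mask)  # "AND" of the thresholded images

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    # Douglas-Peucker smoothing of the patch boundary.
    return cv2.approxPolyDP(largest, 0.02 * cv2.arcLength(largest, True), True)
```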
  • the white color region data is processed to determine observed color values for the white color region.
  • color-balanced image data for the exposed skin region is generated based on the observed color values for the white color region.
  • K_w = (R'_w, G'_w, B'_w) denotes the raw observed color values of the white region on the color calibration target.
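  • The patent does not spell out the balancing formula at this point, but a white-patch style adjustment consistent with the notation above scales each channel so that the observed white patch maps to reference white. The sketch below illustrates that assumption.

```python
import numpy as np

def white_balance(rgb, white_patch_pixels):
    """Scale each RGB channel so the observed white calibration patch maps
    to reference white (255, 255, 255).

    This white-patch scaling is an illustrative assumption; the patent only
    states that color-balanced data is derived from the observed white-region
    values K_w = (R'_w, G'_w, B'_w)."""
    observed_white = white_patch_pixels.reshape(-1, 3).mean(axis=0)  # (R'_w, G'_w, B'_w)
    gains = 255.0 / np.maximum(observed_white, 1e-6)
    balanced = rgb.astype(np.float64) * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)
```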
  • FIG. 9A illustrates a method 900 for estimating the bilirubin level in a patient, in accordance with many embodiments.
  • the method 900 can be practiced using color-balanced RGB image data obtained as previously described herein.
  • the color-balanced RGB image data is transformed into at least one other color space to generate color-balanced image data for the exposed skin region for the at least one other color space.
  • the yellowness of the color-balanced skin region image data correlates to the bilirubin level in the patient's body.
  • the image sensor (e.g., a CCD or CMOS sensor) of the mobile device interpolates reflected light into red, green, and blue wavelengths, which can prevent the sensor from capturing the reflection of the yellow wavelength band clearly.
  • the color-balanced RGB data can be transformed into a plurality of different color spaces, such as cyan, magenta, and yellow (CMY); cyan, magenta, yellow, and black (CMYK); YCbCr; or Lab color spaces. Any suitable number or combination of color spaces can be used.
  • the RGB image data is transformed into CMYK, YCbCr, and Lab color spaces.
  • the RGB image data can be transformed into CMY, YCbCr, and Lab color spaces.
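  • A sketch of these color-space transforms is shown below, using scikit-image for the YCbCr and Lab conversions and the simple complement relation for CMY; the choice of library is an assumption, and a CMYK variant would additionally derive K from max(R, G, B).

```python
import numpy as np
from skimage import color

def transform_color_spaces(rgb):
    """Transform a color-balanced RGB image (H, W, 3, values 0..255) into
    CMY, YCbCr, and Lab representations."""
    rgb01 = rgb.astype(np.float64) / 255.0

    cmy = 1.0 - rgb01               # simple CMY complement of normalized RGB
    ycbcr = color.rgb2ycbcr(rgb01)  # luma plus chroma channels
    lab = color.rgb2lab(rgb01)      # lightness plus opponent channels (b* tracks yellowness)
    return {"RGB": rgb01, "CMY": cmy, "YCbCr": ycbcr, "Lab": lab}
```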
  • a plurality of normalized chromatic and achromatic features are processed to select a first estimated range of the bilirubin level from a plurality of different bilirubin ranges.
  • the features can be chromatic and/or achromatic (e.g., luminescence) values for the skin region, with each feature corresponding to a color space value of the color spaces used.
  • the features can include calculations of one or more color gradients across the skin region. Any suitable number and combination of features can be used.
  • three features can be extracted from each of four color spaces (e.g., RGB, CMY, YCbCr, Lab; or RGB, CMYK, YCbCr, Lab) to obtain 12 chromatic and achromatic features.
  • features can be separately extracted from image data obtained with flash illumination and without flash illumination, respectively, resulting in a total of 24 chromatic and achromatic features.
  • the features can be normalized based on the overall luminescence of the image, as previously described herein with respect to act 620 of the method 600.
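  • The sketch below illustrates one plausible way to assemble such a feature vector (the mean skin-region value of three channels per color space, for flash and no-flash images, giving 24 features). Which three values are taken from each space is an assumption, since the text only specifies the counts.

```python
import numpy as np

def extract_features(color_spaces_flash, color_spaces_noflash, skin_mask):
    """Build a 24-element feature vector: 3 channel means from each of 4
    color spaces, computed separately for flash and no-flash images.

    color_spaces_*: dict of color-space name -> (H, W, 3) image.
    skin_mask: boolean (H, W) mask selecting the exposed skin region.
    """
    features = []
    for spaces in (color_spaces_flash, color_spaces_noflash):
        for name in ("RGB", "CMY", "YCbCr", "Lab"):
            skin_pixels = spaces[name][skin_mask]            # (N, 3) skin values
            features.extend(skin_pixels.mean(axis=0)[:3])    # 3 chromatic/achromatic features
    return np.asarray(features)                              # 2 x 4 x 3 = 24 features
```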
  • the extracted features can be used as inputs to machine learning regressions used for estimating the bilirubin level.
  • the regressions can utilize some or all of the extracted features, with the optimal subset of features to be used selected based on machine learning techniques.
  • the machine learning regressions described herein can be trained on suitable data sets, such as clinical patient data, and can incorporate any suitable number and combination of parametric and non-parametric regression models.
  • the initially calculated features and the output of the selected regressors are used to classify the estimated bilirubin level into one of a plurality of different bilirubin ranges, such as low, medium, and high ranges.
  • the features are processed using a processing approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level.
  • the normalized chromatic and achromatic features can be used to inform one or more machine learning regressions.
  • the inputs to the regressions can differ based on whether the first estimated range of the bilirubin level is low, medium, or high, as provided in greater detail below.
  • the classification of the first estimated range can be used as input to the machine learning regressions.
  • the results of the initial regression performed in act 920 can also be used as input. This "two-tiered" approach to bilirubin estimation can be used to generate more accurate estimation results compared to direct estimation.
  • FIG. 9B illustrates a method 950 for estimating the bilirubin level in a patient, in accordance with many embodiments.
  • the method 950 can be practiced in combination with the method 900 in order to obtain a more accurate estimate of the bilirubin level. Similar to the method 900, the method 950 can be practiced using color-balanced RGB image data obtained as previously described herein.
  • in act 960, the color-balanced RGB image data is transformed into at least one other color space to generate color-balanced image data for the exposed skin region for the at least one other color space, as discussed above with respect to act 910 of the method 900.
  • the color-balanced image data for the exposed skin region is processed to determine a color space value for the patient's skin.
  • the color space value for the patient's skin can be used to classify the patient's skin color into one of a plurality of different skin color types, such as light-skinned, medium-skinned, and dark-skinned.
  • the skin color type can be related to the race and/or ethnicity of the patient.
  • the skin color type can be determined based on the color values of the skin region and/or color calibration target obtained from the color-balanced RGB image data. For example, the skin region can be compared to one or more standardized color regions of the color calibration target (e.g., the first and second skin tone color regions) to determine a skin color type.
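  • One way such a comparison could look is sketched below: the mean color-balanced skin value is compared to the two skin-tone patches on the calibration card. The distance-based decision rule is purely an illustrative assumption.

```python
import numpy as np

def classify_skin_type(skin_rgb_mean, light_patch_rgb, dark_patch_rgb):
    """Classify the patient's skin as light, medium, or dark by comparing the
    mean color-balanced skin value to the two skin-tone calibration patches.
    The thresholding rule here is an illustrative assumption."""
    d_light = np.linalg.norm(np.asarray(skin_rgb_mean) - np.asarray(light_patch_rgb))
    d_dark = np.linalg.norm(np.asarray(skin_rgb_mean) - np.asarray(dark_patch_rgb))
    if d_light < 0.5 * d_dark:
        return "light"
    if d_dark < 0.5 * d_light:
        return "dark"
    return "medium"
```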
  • a plurality of normalized chromatic and achromatic features are processed using an approach based on the determined skin color to estimate the bilirubin level.
  • the normalized chromatic and achromatic features can be extracted from the color-balanced image data for one or more different color spaces, as previously described above with respect to the method 900.
  • the features, along with the determined skin color, can be used as inputs to suitable machine learning regressions, similar to the act 930 of the method 900.
  • FIG. 10 illustrates a method 1000 for estimating the bilirubin level in a patient, in accordance with many embodiments.
  • the method 1000 includes obtaining images from a camera 1002, color balancing the image data 1004, performing feature extraction from the color-balanced image data 1006, performing machine learning regression based on the features 1008, and generating a bilirubin estimate 1010.
  • Image data can be obtained from the camera of a mobile device under the control of a suitable mobile app (act 1002). Some of the image data can be obtained with flash illumination and some of the image data can be obtained without flash illumination. Once the app has verified that the images are of sufficient quality for processing and analysis, the image data can be color balanced (act 1004).
  • the color balancing can involve identification of the image data subsets corresponding to the color calibration target, and automatic segmentation of the standardized color regions of the target (act 1012) using the thresholding methods previously described herein.
  • the segmented white color region can be used to white balance the image data (act 1014), thereby generating color-balanced image data.
  • the color-balanced image data can be RGB image data.
  • One or more features can be extracted from the color-balanced image data (act 1006).
  • the feature extraction process can involve transforming the image data from an RGB color space to a plurality of different color spaces (act 1016), as previously described herein.
  • the transformed image data can then be used to calculate a plurality of normalized chromatic and achromatic features (act 1018).
  • an initial set of machine learning regressions can include five different regressions (acts 1020, 1022, 1024, 1026, and 1028).
  • the first regression can include one or more encapsulated k-Nearest Neighbor (kNN) regressions (act 1020).
  • This regression can utilize a database of known features and bilirubin values.
  • the convex hull can be found around the test vector in the database of features. Feature points from the convex hull can then be used in a linear regression. A new regression can be built each time that a new test point is analyzed.
  • the parameters for creating the hull can be normalized values of luminosity (e.g., from the YCbCr color space transform) and the "green" channel (e.g., from the RGB color space). This can be used to guarantee that the convex hull calculation only occurs in two dimensions, thus ensuring the number of points in the convex hull is tractable (e.g., approximately four to six points).
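  • A rough sketch of this encapsulated kNN idea follows, assuming a training database of 2D (luminosity, green) points and measured bilirubin values. Interpreting "the convex hull around the test vector" as the hull of the nearest database points is an assumption made for illustration.

```python
import numpy as np
from scipy.spatial import ConvexHull
from sklearn.linear_model import LinearRegression

def encapsulated_knn_estimate(test_point2d, db_points2d, db_bilirubin, k=12):
    """Take the k database points nearest to the test point in the 2D
    (luminosity, green) space, keep only the vertices of their convex hull
    (typically ~4-6 points), and fit a local linear regression on those
    vertices; a new regression is built for every test point."""
    dists = np.linalg.norm(db_points2d - test_point2d, axis=1)
    nearest = np.argsort(dists)[:k]

    hull = ConvexHull(db_points2d[nearest])
    hull_idx = nearest[hull.vertices]                 # encapsulating points

    model = LinearRegression().fit(db_points2d[hull_idx], db_bilirubin[hull_idx])
    return model.predict(test_point2d.reshape(1, -1))[0]
```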
  • the second regression can include one or more lasso regressions and/or one or more least angle regressions (LARS) (act 1022).
  • LARS can be helpful for deciding which features out of the total set of extracted features are the most useful, using a variant of forward feature selection.
  • the best predictor(s) from the feature set can be chosen by developing a single-feature linear regression from each feature.
  • the feature most correlated with the output can be chosen as the "first" feature. This prediction can be subtracted from the output to obtain the residuals.
  • the algorithm can diverge from other forward feature selection algorithms in that it attempts to find another feature with roughly the same correlation to the residuals as the first feature to the output.
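  • In scikit-learn terms, the lasso/LARS step could be sketched as below; the cross-validation setting is a placeholder, and the function wrapper is hypothetical.

```python
from sklearn.linear_model import LassoLarsCV

def lars_lasso_regression(features_train, bilirubin_train, features_test):
    """LARS-driven lasso with cross-validated regularization; the sparse
    coefficients indicate which chromatic/achromatic features are most
    useful (zero-weight features are effectively dropped)."""
    model = LassoLarsCV(cv=5).fit(features_train, bilirubin_train)
    kept = [i for i, w in enumerate(model.coef_) if w != 0.0]
    return model.predict(features_test), kept
```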
  • the third regression can include one or more elastic net regressions, also known as elastic net algorithms (act 1024).
  • the elastic net regression is a combination of lasso regression and ridge regression.
  • the algorithm can also employ the L1 and L2 norms in its objective function. This makes it related to LARS and lasso, but with a certain "backoff" regularization so that it can become more stable.
  • the parameters can be cross-validated using a stepwise exhaustive process.
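  • A minimal elastic net sketch, with the mixing ratio and regularization strength chosen by cross-validation over a grid (the grid values are illustrative placeholders):

```python
from sklearn.linear_model import ElasticNetCV

def elastic_net_regression(features_train, bilirubin_train, features_test):
    """Elastic net: combines the L1 (lasso) and L2 (ridge) penalties; the
    mixing ratio and strength are selected by cross-validated grid search."""
    model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.7, 0.9, 0.95, 1.0], cv=5)
    model.fit(features_train, bilirubin_train)
    return model.predict(features_test)
```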
  • the fourth regression can include one or more support vector (SV) regressions (act 1026).
  • SV regressions can be employed in order to capture the possible non-linear relationship between the image data and the bilirubin levels.
  • SV regression can be used to find a linear regression function in a high-dimensional feature space, into which the input data can be mapped using a potentially nonlinear function.
  • the first SV regression can use a linear kernel and the second SV regression can assign higher weight to higher-rated bilirubin values using a nonlinear radial basis function.
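  • The two SV regressions could be sketched as below; the sample-weighting scheme used to emphasize higher bilirubin values is an illustrative assumption.

```python
import numpy as np
from sklearn.svm import SVR

def support_vector_regressions(features_train, bilirubin_train, features_test):
    """One SV regression with a linear kernel and one with a radial basis
    function kernel whose training weights grow with the measured bilirubin
    value (the exact weighting is illustrative)."""
    bilirubin_train = np.asarray(bilirubin_train, dtype=float)

    linear_pred = SVR(kernel="linear").fit(features_train, bilirubin_train).predict(features_test)

    # Up-weight training examples with higher bilirubin values.
    weights = 1.0 + bilirubin_train / np.max(bilirubin_train)
    rbf_model = SVR(kernel="rbf").fit(features_train, bilirubin_train, sample_weight=weights)
    return linear_pred, rbf_model.predict(features_test)
```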
  • the fifth regression can include one or more random forest regressions (act 1028).
  • the fifth regression can use a random forest regression with 500 trees.
  • a random forest is a collection of estimators. It can use many "classifying" decision trees on various sub-samples of the dataset. The outputs of these trees can be averaged to improve the predictive accuracy and to control over-fitting. Each tree can be created using a random sub-sample (with replacement).
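  • The random forest step maps directly onto a standard implementation; the sketch below simply uses 500 bootstrap-trained trees whose outputs are averaged.

```python
from sklearn.ensemble import RandomForestRegressor

def random_forest_regression(features_train, bilirubin_train, features_test):
    """Random forest of 500 trees, each trained on a bootstrap sub-sample
    (with replacement); predictions are averaged across trees."""
    model = RandomForestRegressor(n_estimators=500, bootstrap=True, random_state=0)
    model.fit(features_train, bilirubin_train)
    return model.predict(features_test)
```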
  • other regressions can also be used, in addition to or in place of one or more of the regressions described herein.
  • one or more linear regressions can also be used.
  • the method 1000 can be practiced using any suitable type of regression, including linear and non- linear regressions.
  • one or more multi-layer classifiers can be used (act 1030).
  • the initially calculated features, along with the output of the initial set of regressors, can be used to classify a first estimated range of the bilirubin level into one of a plurality of different bilirubin ranges, as previously described herein.
  • the results of all the classifiers, as well as the log-likelihood for each class can be used as the features for a final stacked regression, which can be a final random forest regression (act 1032).
  • the original extracted features and the results of the initial regressions can also be used as the features for this final stage regression.
  • the final regressor can be trained using suitable machine learning algorithms, such as AdaBoost.
  • the output of the final regressor can be used as the final estimate of the bilirubin level 1010 (e.g., measured in milligrams per deciliter).
  • leave-one-out cross validation can be used at all levels of learning.
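  • The second, stacked tier could be organized as in the sketch below: a classifier first assigns a low/medium/high range, and its class probabilities, the first-stage regression outputs, and the original features are stacked as inputs to a final random forest regressor. The exact stacking recipe and the helper names are assumptions made for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

def train_stacked_estimator(features, first_stage_preds, bilirubin, ranges):
    """Train the second ('stacked') tier of the two-tiered approach.

    features:          (n_samples, 24) chromatic/achromatic features.
    first_stage_preds: (n_samples, n_regressors) outputs of the initial regressions.
    bilirubin:         measured bilirubin values for training.
    ranges:            low/medium/high labels for the training samples.
    """
    stage1 = np.hstack([features, first_stage_preds])

    range_clf = RandomForestClassifier(n_estimators=200, random_state=0)
    range_clf.fit(stage1, ranges)
    class_probs = range_clf.predict_proba(stage1)          # per-range likelihoods

    stacked_inputs = np.hstack([stage1, class_probs])
    final_reg = RandomForestRegressor(n_estimators=500, random_state=0)
    final_reg.fit(stacked_inputs, bilirubin)
    return range_clf, final_reg
```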
  • FIG. 11A illustrates a method 1100 for estimating the bilirubin level in a patient using baseline skin color data, in accordance with many embodiments.
  • the method 1100 is similar to the method 600, except that the current skin color data of the patient is compared to baseline skin color data in order to generate the bilirubin estimate.
  • This approach can be used to compensate for factors that may confound the image analysis, such as differing skin tones due to the patient's race or ethnicity.
  • baseline skin color data for the patient is received, the baseline skin color data corresponding to when the patient has a reference bilirubin level.
  • the reference bilirubin level can be a known bilirubin level (e.g., based on TSB or TcB testing). If the patient is an infant, the baseline skin color data can be collected within the first 24 hours of the infant's birth, when the bilirubin level is typically very low, such as approximately zero. Suitable methods for collecting and generating baseline skin color data are described below.
  • in act 1120, image data is received for at least one image including a region of the patient's skin and a color calibration target.
  • color-balanced image data for the skin region is generated based on a subset of the image data corresponding to the color calibration target and the skin region.
  • the image data collection and color-balancing processes can be similar to those previously described herein with respect to acts 610 and 620 of the method 600, respectively.
  • the acts 1120 and 1130 can be performed at any time after the baseline skin color data is received and can be repeated over any suitable period of time (e.g., the first four to five days of the infant's life) so as to generate sequential image data sets used to determine whether the skin is becoming more yellow.
  • the bilirubin level in the patient is estimated based on differences between the baseline skin color data and the color-balanced image data for the skin region.
  • the baseline skin color data serves as a standard against which the current image data is compared.
  • the estimation can be performed by transforming the color-balanced image data into a plurality of color spaces, extracting features from the transformed image data, and then using the features for machine learning regressions to generate a bilirubin estimate. At least some of the regressions described herein can also use some or all of the features extracted from the baseline skin color data.
  • FIG. 11B illustrates a method 1150 for generating baseline skin color data, in accordance with many embodiments.
  • baseline image data for the patient is captured when the patient has the reference bilirubin level, the baseline image data corresponding to at least one image including the skin region and a baseline color calibration target.
  • the collection of the baseline image data can be similar to the collection procedures previously described herein with respect to act 610 of the method 600.
  • the baseline color calibration target can be the same as the color calibration target used for collecting regular image data, or can be a different color calibration target.
  • the baseline image data can include the same skin regions as regular image data, or different skin regions.
  • the reference bilirubin level can be any known bilirubin level, such as a bilirubin level determined by testing or taken within 24 hours of birth.
  • color-balanced baseline image data is generated for the skin region based on the baseline image data.
  • the generation of the color-balanced baseline image data can be performed using any of the techniques previously described herein with respect to regular image data.
  • the baseline skin color data is generated based on the color-balanced baseline image data for the skin region. This process can involve feature extraction from the color-balanced baseline image data and using the features for machine learning regressions, as discussed above.
  • FIG. 12 illustrates a mobile device 1200 for estimating bilirubin level in a patient, in accordance with many embodiments.
  • the mobile device 1200 includes a camera 1202 suitable for capturing image data, and a flash unit 1204 suitable for providing flash illumination.
  • the camera 1202 and flash unit 1204 can be built-in hardware of the mobile device 1200.
  • the camera 1202 and flash unit 1204 can be provided separately from and coupled to the mobile device 1200 (e.g., via wired or wireless communication).
  • the mobile apps described herein can detect whether the camera 1202 is capable of capturing images with sufficiently high resolution for the subsequent image analysis, and can alert the user if this criterion is not met.
  • the mobile device 1200 also includes an input unit 1206 for receiving input from a user and a display 1208 for displaying content to the user.
  • the input unit 1206 can include keyboards, mice, touchscreens, joysticks, and the like.
  • the input unit 1206 can also be configured to accept voice commands or gestural commands.
  • the display 1208 can include a monitor, screen, touchscreen, and the like. In some instances, the input unit 1206 and the display 1208 can be implemented across shared hardware (e.g., the same touchscreen is used to accept input and display output).
  • the display 1208 can display one or more suitable UIs to the user, such as UIs of a mobile app for estimating bilirubin levels.
  • the mobile device 1200 includes one or more processors 1210, a memory or other data storage device 1212 storing image data as well as one or more software modules 1214, and a communication unit 1216.
  • the processors 1210 can be operably coupled to the camera 1202 and/or flash unit 1204 to control one or more functions (e.g., record function, zoom function, flash illumination function).
  • the processor 1210 can also be operably coupled to the memory 1212 such that the processor 1210 can receive and execute instructions provided by the software module 1214.
  • the software module 1214 can be implemented as part of the mobile apps described herein and can provide instructions for carrying out one or more acts of the previously discussed methods. For example, the software module 1214 can enable the mobile device 1200 to capture image data of a patient and calibration target.
  • the software module 1214 can perform some or all of the image processing and analysis tasks disclosed above (e.g., color balancing, feature extraction, machine learning regression).
  • the software module 1214 can be adapted to a plurality of different types of mobile devices.
  • the mobile device 1200 can be configured to receive and install software updates, such as updates improving one or more image capture, processing, and analysis algorithms, thereby enabling the mobile app to be easily and quickly upgraded as necessary.
  • the communication unit 1216 of the mobile device 1200 can be configured to receive and/or transmit data (e.g., image data, bilirubin estimates, software updates, etc.) between the mobile device 1200 and a separate device or system, such as a remote server or other computing system.
  • the communication unit can use any suitable combination of wired or wireless communication methods, including Wi-Fi communication.
  • the communication unit 1216 can also be operably coupled to the processors 1210, such that data communication to and from the mobile device 1200 can be controlled based on instructions provided by the software module 1214.
  • FIG. 13 illustrates a mobile device 1300 in communication with a data processing system 1302 for estimating bilirubin levels, in accordance with many embodiments.
  • the mobile device 1300 can include any of the components previously described herein with respect to the mobile device 1200 of FIG. 12.
  • the components of the data processing system 1302 can be remote from the mobile device 1300.
  • the data processing system 1302 is a remote server configured to communicate with a plurality of user mobile devices including the mobile device 1300. The communication can utilize any suitable wired or wireless communication methods, as described above.
  • the data processing system 1302 includes one or more processors 1304, a memory or other data storage device 1306 storing one or more software modules 1308, and a communication unit 1310.
  • the communication unit 1310 can be used to communicate data (e.g., image data, bilirubin estimates, software updates, etc.) between the system 1302 and the mobile device 1300 (e.g., via SMS text messaging).
  • the communication unit 1310 can receive image data provided by the mobile device 1300, such as image data that has not yet been color balanced.
  • the data obtained from the mobile device 1300 can be stored in the memory 1306.
  • the software module 1308 can provide instructions executable by the processors 1304 to process and analyze the image data (e.g., color balancing, feature extraction, machine learning regression).
  • the processors 1304 can output an estimate of the patient's bilirubin level, which can be stored in the memory 1306 and/or transmitted to the mobile device 1300. In some instances, depending on user preference, the results can also be transmitted to a third party, such as a medical professional who can review the results and provide the user with further instructions as necessary.
  • the mobile devices described herein can be configured to illuminate a patient's skin with different wavelengths of light (e.g., 460 nm, 540 nm) and capture images of the illuminated skin using a camera.
  • the timing and sequence of illumination can be controlled by a mobile app.
  • the mobile app can analyze the collected image data to measure the intensity of different wavelengths of light reflected from the skin regions.
  • the absorption of different wavelengths differs based on the color of the skin, including yellow discoloration.
  • Some wavelengths can be affected by bilirubin levels, such that their intensities provide an indication of the amount of bilirubin in the patient's body. Accordingly, suitable machine learning regressions and/or models can be developed to enable wavelength absorption to be used as input for estimating bilirubin levels in a patient.
  • this approach can be more robust to different environmental and situational conditions.
  • the mobile device can include a front-facing camera (e.g., a camera disposed on the same side of the mobile device as the screen) that can be used to capture image data (e.g., photographs, videos) used to assess ambient lighting conditions.
  • Such data provides alternative and/or additional ambient lighting information that can be used to normalize the color data of the patient's skin during estimation of the bilirubin level of the patient.
  • Storage media and computer readable media for containing code, or portions of code, can include any appropriate media known or used in the art, including storage media and communication media, such as but not limited to volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information such as computer readable instructions, data structures, program modules, or other data, including RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disk (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, solid state drives (SSD) or other solid state storage devices, or any other medium which can be used to store the desired information and which can be accessed by a system device.

Abstract

Systems, methods, and devices are provided for estimating bilirubin levels. In one aspect, a method for estimating the level of bilirubin in a patient includes receiving image data for at least one image including a region of the patient's skin and a color calibration target. Color-balanced image data for the skin region is generated based on a subset of the image data corresponding to the color calibration target and the skin region. The bilirubin level in the patient is estimated based on the color-balanced image data for the skin region.

Description

ESTIMATING BILIRUBIN LEVELS
CROSS-REFERENCE
[0001] This application claims the benefit of U.S. Provisional Application No. 61/777,097, filed March 12, 2013, which application is incorporated herein by reference.
BACKGROUND
[0002] An estimated 60-84% of newborn infants develop neonatal jaundice, which produces a yellowing of the skin caused by the accumulation of excess bilirubin (hyperbilirubinemia), a naturally occurring compound produced by the breakdown of red blood cells. Although this condition is typically harmless and resolves within a few days, highly elevated levels of bilirubin can lead to kernicterus, a devastating and irreversible neurological condition characterized by deafness, cerebral palsy, profound developmental delay, or even death.
[0003] Current approaches for monitoring bilirubin levels in infants typically require repeated testing in a hospital setting. The blood concentration of bilirubin can be determined by the total serum bilirubin (TSB), measured from a blood sample, or via a transcutaneous bilirubinometer (TcB) measurement accomplished using a non-invasive but costly instrument. These tests are often unavailable in resource-poor settings, thus impeding early detection and treatment of kernicterus. Visual assessments, which are frequently used as an alternative to these tests, are often inaccurate and can be confounded by factors such as lighting or skin tone. Accordingly, improved approaches are needed for providing non-invasive, cost-effective screening for excessive bilirubin levels.
SUMMARY
[0004] Systems, methods, and devices are provided for estimating bilirubin levels. In many embodiments, a mobile device is used to capture image data of a patient's skin and a color calibration target. The image data is processed to generate an estimation of the bilirubin level. The image processing can include transforming the image data into a plurality of different color spaces to facilitate assessment of the overall yellowness of the skin while compensating for color differences caused by lighting, skin tone, and other potentially confounding factors. The screening techniques described herein can be practiced by users (e.g., parents, medical professionals, community health workers) in an outpatient setting (e.g., a patient's home) without requiring specialized medical equipment, thereby improving the convenience, accessibility, and cost-effectiveness of bilirubin monitoring.
[0005] Thus, in a first aspect, a method is provided for estimating the level of bilirubin in a patient. The method includes receiving image data for at least one image including a region of the patient's skin and a color calibration target. Color-balanced image data for the skin region is generated based on a subset of the image data corresponding to the color calibration target and the skin region. The bilirubin level in the patient is estimated based on the color-balanced image data for the skin region. In many embodiments, the image data can be obtained with any suitable imaging device. The imaging device can collect image data independently of additional attachments or equipment, such as external lenses, filters, or other specialized hardware.
[0006] The bilirubin level can be estimated using only image data of the patient's skin color at a particular point in time, or by comparing the skin color image data to baseline skin color image data. For example, the method can further include receiving baseline skin color data for the patient corresponding to when the patient has a reference bilirubin level (e.g., approximately zero when baseline data is obtained within 24 hours of birth). The bilirubin level can be estimated based on one or more differences between the baseline skin color data and the color-balanced image data for the skin region. The baseline skin color data for the patient can be generated by capturing baseline image data for the patient when the patient has the reference bilirubin level. The baseline image data can correspond to at least one image including the skin region and a baseline color calibration target. Color-balanced baseline image data for the skin region can be generated based on a subset of the baseline image data corresponding to the baseline color calibration target and the skin region. The baseline skin color data can be generated based on the color-balanced baseline image data for the skin region.
[0007] A standardized color calibration target can be used to facilitate the color balancing process. The color calibration target can include a plurality of standardized color regions, including a white color region. The standardized color regions can include a black region, a gray region, a light brown region, a cyan region, a magenta region, a yellow region, and a dark brown region. The color calibration target can at least partially define an opening configured to expose the skin region to permit capturing of image data for the skin region. The standardized color regions can be disposed in a known arrangement surrounding the opening. Accordingly, the process for generating color-balanced image data for the skin region can include processing the received image data to identify a subset of the image data corresponding to the exposed skin region and a subset of the image data corresponding to the white color region. The white color region data can be processed to determine observed color values for the white color region. Color-balanced image data for the exposed skin region can be generated based on the observed color values for the white color region. The observed color values for the white color region can include any suitable color space values, such as red, green, blue (RGB) color space values.
[0008] The image data can be transformed into a plurality of different color spaces in order to detect yellow discoloration of the skin. For example, the color-balanced image data for the skin region can include RGB color space data, and a method of estimating the level of bilirubin in a patient can further include transforming the RGB color space data into at least one other color space to generate color-balanced image data for the exposed skin region for the at least one other color space. The at least one color space can include: (a) a cyan, magenta, yellow, and black (CMYK) color space; (b) a YCbCr color space; and/or (c) a Lab color space.
[0009] A plurality of chromatic and achromatic features can be generated based on the image data. In some instances, the received image data can include an image obtained using flash illumination and an image obtained without using flash illumination. Estimating the bilirubin level can include processing a plurality of normalized chromatic and achromatic features to select a first estimated range of the bilirubin level from one of a plurality of different bilirubin ranges. The features can be processed using an approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level. The plurality of different bilirubin ranges can include a low range, a medium range, and a high range. The plurality of features can include selected color values of the skin region for a plurality of different color spaces. In some instances, the plurality of features can include a calculation of a color gradient across the skin region.
[0010] Estimation of the bilirubin level can involve performing one or more regressions. For example, processing the features to select a first estimated range of the bilirubin level can include performing a series of regressions, including at least one of: (a) a linear regression, (b) an encapsulated k-Nearest Neighbor regression, (c) a lasso regression, (d) a LARS regression, (e) an elastic net regression, (f) a support vector regression using a linear kernel, (g) a support vector regression assigning higher weight to higher-rated bilirubin values, and (h) a random forest regression. Processing the features to select a first estimated range of the bilirubin levels can include performing a series of regressions, including: (a) a linear regression, (b) an encapsulated k-Nearest Neighbor regression, (c) a lasso regression, (d) a LARS regression, (e) an elastic net regression, (f) a support vector regression using a linear kernel, (g) a support vector regression assigning higher weight to higher-rated bilirubin values, and (h) a random forest regression. Using a processing approach based on the selected first estimated range of the bilirubin level can include performing a final random forest regression that uses the plurality of normalized chromatic and achromatic features and the selected first estimated range of the bilirubin level as the features for the final random forest regression. In some instances, estimating the bilirubin level includes determining color space value for the patient's skin and using a processing approach based on the determined color space value to estimate the bilirubin level. The regression equations can also include features from the baseline image (e.g., color space values from one or more color spaces).
[0011] In another aspect, a mobile device configured to estimate the level of bilirubin in a patient is provided. The device includes a camera operable to capture image data for a field of view, a processor operatively coupled with the camera, and a data storage device operatively coupled with the processor. The data storage device can store instructions that, when executed by the processor, cause the processor to receive image data for an image captured by the camera, the image including a region of the patient's skin and a color calibration target. The instructions can cause the processor to generate color-balanced image data for the skin region based on a subset of the image data corresponding to the color calibration target and the skin region, and estimate the bilirubin level in the patient based on the color-balanced image data for the skin region. In many embodiments, the mobile device can be used to estimate the bilirubin level independently of any external attachments to the mobile device (e.g., lenses, filters) or any other specialized mobile device equipment.
[0012] The color calibration target can at least partially define an opening configured to expose the skin region to permit capturing of image data for the skin region, and can include a plurality of standardized color regions including a white color region. The instructions can cause the processor to process the received image data to identify a subset of the image data corresponding to the exposed skin region and a subset of the image data corresponding to the white color region. The white color region data can be processed to determine observed color values for the white color region. Color-balanced RGB image data can be generated for the exposed skin region based on the observed color values for the white color region. Color-balanced image data for the exposed skin region can be generated for at least one other color space by transforming the color-balanced RGB image data into the at least one other color space. A plurality of normalized chromatic and achromatic features can be processed to select a first estimated range of the bilirubin level from one of a plurality of different bilirubin ranges. The features can be processed using an approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level.
[0013] Alternatively or in addition, the color-balanced image data for the exposed skin region can be processed to determine a color space value for the patient's skin. A plurality of normalized chromatic and achromatic features can be processed using an approach based on the determined patient's skin color space value to estimate the bilirubin level.
[0014] The mobile devices described herein can further include a flash unit operable to selectively illuminate the field of view. The received image data processed to estimate the bilirubin level can include an image captured with the field of view being illuminated by the flash unit and an image captured with the field of view not being illuminated by the flash unit.
[0015] In another aspect, a method for estimating the level of bilirubin in a patient is provided. The method includes receiving, from a mobile device, image data for an image including a skin region of a patient and a color calibration target. Color-balanced image data for the skin region can be generated based on a subset of the image data corresponding to the color calibration target and the skin region, via one or more processors. The bilirubin level in the patient can be estimated based on the color-balanced image data for the skin region, via the one or more processors. The estimated bilirubin level can be transmitted to the mobile device. In many embodiments, at least one of receiving the image data and transmitting the estimated bilirubin level are performed using short message service (SMS) text messaging. The image data can be obtained by the mobile device without the use of external attachments, hardware add-ons, or any other specialized equipment.
[0016] Other objects and features of the present invention will become apparent by a review of the specification, claims, and appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
[0018] FIG. 1A illustrates guidelines for the use of phototherapy to treat neonatal jaundice, in accordance with embodiments;
[0019] FIG. 1B illustrates guidelines for the use of exchange transfusion to treat neonatal jaundice, in accordance with embodiments;
[0020] FIG. 1C illustrates a Bhutani nomogram for assessing risk associated with neonatal jaundice;
[0021] FIG. 2 illustrates the use of a mobile device to capture image data used to estimate the bilirubin level in a patient, in accordance with many embodiments;
[0022] FIG. 3A through FIG. 3D illustrate color calibration targets used in conjunction with a mobile device to capture image data used to estimate bilirubin level, in accordance with many embodiments;
[0023] FIG. 4A through FIG. 4C illustrate exemplary user interfaces of a mobile application for estimating bilirubin level, in accordance with many embodiments;
[0024] FIG. 5A through FIG. 5F illustrate exemplary image data collected for use in estimating bilirubin level, in accordance with many embodiments;
[0025] FIG. 6 illustrates a method for estimating the bilirubin level in a patient, in accordance with many embodiments;
[0026] FIG. 7 illustrates a method for generating color-balanced image data for a patient's skin, in accordance with many embodiments;
[0027] FIG. 8 illustrates identification of a standardized color region in an image, in accordance with many embodiments;
[0028] FIG. 9A illustrates a method for estimating the bilirubin level in a patient, in accordance with many embodiments;
[0029] FIG. 9B illustrates another example of a method for estimating the bilirubin level in a patient, in accordance with many embodiments;
[0030] FIG. 10 illustrates a method for estimating the bilirubin level in a patient, in accordance with many embodiments;
[0031] FIG. 11A illustrates a method for estimating the bilirubin level using baseline skin color data, in accordance with many embodiments;
[0032] FIG. 11B illustrates a method for generating baseline skin color data, in accordance with many embodiments;
[0033] FIG. 12 illustrates a mobile device for estimating bilirubin level, in accordance with many embodiments; and
[0034] FIG. 13 illustrates a mobile device in communication with a data processing system for estimating bilirubin level, in accordance with many embodiments.
DETAILED DESCRIPTION
[0035] The systems, devices, and methods described herein provide improved approaches for estimating bilirubin level in a patient (e.g., an infant or an adult). A mobile device configured with suitable software can be used to capture images of a patient's skin and a standardized color calibration target. The image data of the skin and target can be used to generate color-balanced image data that is analyzed to estimate the bilirubin level in the patient. For example, the color-balanced image data can be transformed into a plurality of different color spaces in order to extract features representative of the yellowness of the skin, and these features can be used in a plurality of regressions to generate the bilirubin estimate. Contrary to existing approaches for measuring bilirubin level, which either rely upon an invasive blood test or utilize costly instrumentation, the systems, devices, and methods described herein enable convenient, portable, and inexpensive bilirubin level estimation that can easily be performed by non-medical personnel using a personal mobile device, thereby improving the accessibility and cost-effectiveness of bilirubin monitoring. Advantageously, the methods described herein can be performed on a mobile device without requiring the use of external attachments, hardware add-ons, or any other specialized mobile device equipment. Notably, the disclosed techniques provide accurate bilirubin level estimation over a large range of bilirubin concentrations, in contrast to TcB measurement, which exhibits reduced accuracy at high bilirubin levels. Additionally, the methods described herein account for diversity in skin tones as well as different lighting conditions, thereby improving the accuracy and flexibility of non-invasive bilirubin level estimation.
Furthermore, the use of mobile device software platforms enables easy and rapid updating of the estimation methods and algorithms described herein, thus allowing improvements and upgrades to be made as necessary.
[0036] Turning now to the drawings, FIG. 1A illustrates guidelines for the use of phototherapy to treat neonatal jaundice. Similarly, FIG. 1B illustrates guidelines for the use of exchange transfusion to treat neonatal jaundice. The guidelines can be used by a medical professional to determine the appropriate course of treatment, based on the infant's age, total serum bilirubin (TSB), number of weeks of gestation (e.g., > 35 weeks), and other risk factors. Risk factors may include isoimmune hemolytic disease, G6PD deficiency, asphyxia, significant lethargy, temperature instability, sepsis, acidosis, or an albumin level lower than 3.0 grams per deciliter. The curves depicted in FIGS. 1A and 1B indicate exemplary treatment thresholds for low risk, medium risk, and high risk infants.
[0037] FIG. 1C illustrates a Bhutani nomogram 110 for assessing risk associated with neonatal jaundice. The Bhutani nomogram 110 can be used by a medical professional to assess an infant's risk for developing hyperbilirubinemia based on the infant's postnatal age and bilirubin level. For example, the Bhutani nomogram 110 can include a plurality of percentile curves 112 used to define a low risk zone 114, a low intermediate risk zone 116, a high intermediate risk zone 118, and a high risk zone 120.
[0038] FIG. 2 illustrates mobile device-based estimation of bilirubin levels, in accordance with many embodiments. An imaging device, such as a mobile device 202 (e.g., a smartphone, tablet) is used to capture an image of a skin region 204 of a patient 206 and a color calibration target 208. Suitable skin regions for the approaches described herein include the forehead and sternum, as well as any other prominent, flat regions of skin that are likely to be evenly lit. Additionally, since jaundice typically first appears on the forehead and slowly progresses downward on the body, regions closer to the forehead can potentially be more informative for diagnosis.
Accordingly, in many embodiments, the color calibration target 208 is placed on the abdomen of the patient 206 near the sternum. The mobile device 202 includes a camera (not shown) for recording image data, as well as a display 210 used for presenting a user interface (UI) of a mobile software application ("mobile app" or "app") for estimating bilirubin levels based on the image data. The mobile device 202 can be a personal device of a user (e.g., a parent, medical professional, community health worker, etc.), such that the user need only install the mobile app on their personal device in order to perform the methods described herein, with no other instrumentation or hardware being needed besides the color calibration target 208. Notably, the mobile device 202 can be used to practice the methods described herein independently of any further attachment or accessory to the mobile device, such as an external lens, filter, or other specialized mobile device equipment. Additional details on suitable software and hardware components for the mobile device 202 are provided below.
[0039] FIG. 3A through FIG. 3D illustrate color calibration targets that can be used in conjunction with a mobile device for estimating bilirubin levels, in accordance with many embodiments. The color calibration targets described herein (also known as "color calibration cards" and "color cards") can be used to account for differences in lighting or other
environmental conditions that affect the resultant color balance of the skin image data. Referring to FIG. 3A, a color calibration target 300 can be provided on a rectangular card 302. The card 302 can be cardstock of any size suitable for placement onto the skin of a patient (e.g., an infant), such as approximately the size of a business card. In some instances, the card 302 can be sterilizable or disposable, so as to prevent the spread of pathogens between patients. The color calibration target 300 can include a plurality of standardized colored regions 304, which can be printed onto the card 302. The colored regions 304 can be of any suitable size, number, or shape (e.g., square, rectangular, polygonal, circular, elliptical, etc.). For example, the color calibration target 300 is depicted as including eight identically-sized square colored regions 304a-h. Each of the standardized colored regions 304 can be of a different color (e.g., black, gray, white, cyan, magenta, yellow, light brown, dark brown) and be positioned in a known arrangement on the card 302. For example, in the embodiment of FIG. 3A, 304a is a black region, 304b is a gray region (e.g., 50% gray), 304c is a white region, 304d is a light brown region (e.g., a first skin tone), 304e is a cyan region, 304f is a magenta region, 304g is a yellow region, and 304h is a dark brown region (e.g., a second skin tone). Other arrangements and combinations of colors can also be used. Furthermore, the back side (not shown) of the card 302 can include one or more adhesive regions enabling the card 302 to be removably attached to the patient's skin. The back side can also include relevant instructions, such as instructions for the user on how to download the accompanying mobile app onto their personal mobile device.
[0040] FIG. 3B illustrates an alternative embodiment of a color calibration target 320. The color calibration target 320 is substantially similar to the color calibration target 300, except it also defines an opening 322. The opening 322 can be positioned such that when the color calibration target 300 is placed on the patient, the skin region of interest is exposed through the opening 322. The opening 322 can be entirely defined (e.g., completely surrounded) or partially defined (e.g., partially surrounded) by the target 320. A plurality of standardized color regions 324, including a white color region, can be positioned around the opening 322 in a known arrangement. The embodiment of FIG. 3B further includes light gray, medium gray, and dark gray color regions for a total of 10 different standardized color regions. FIG. 3C illustrates a color calibration target 340 provided on a square card 342, with a plurality of standardized color regions 344
surrounding a square opening 346. Some of the color regions 344 can include more than one color, such as color region 348, which includes a central black square bordered by white. FIG. 3D illustrates a color calibration target 360 having a plurality of standardized color regions 362 surrounding an opening 364. The opening 364 can be adjacent to some or all of color regions 362. The color regions 362 are depicted in FIG. 3D as a series of rectangular regions with differing aspect ratios. The colors, geometry, and arrangement of the color regions of the color calibration targets described herein can be selected based on the image processing and analysis methods to be used, such as the embodiments discussed below.
[0041] FIG. 4A through FIG. 4C illustrate exemplary user interfaces (UIs) of a mobile app for estimating bilirubin level, in accordance with many embodiments. FIG. 4A illustrates a summary UI 400 for displaying various statistics and metrics relating to a patient. The UI 400 can include patient identification information 402 (e.g., a patient name, study ID number), time of birth 404, and any available bilirubin level results 406 (e.g., TSB and/or TcB results). The UI 400 can also include a button 408 or other interactive elements enabling the user to collect image data (e.g., a video sample or photographic sample) of a patient. In some instances, video samples can be advantageous for eliminating issues of motion blur. FIG. 4B illustrates an instruction UI 420 for assisting a user with capturing image data of a patient. The UI 420 can include graphical and/or textual instructions 422 directing the user to perform the appropriate steps. For example, the instructions 422 can instruct the user to place a color calibration target onto the abdomen of the patient below the sternum. In many embodiments, the instructions 422 are adapted to be easily comprehended by individuals without medical training, thereby allowing non-medical professionals to operate the mobile app. FIG. 4C illustrates a live preview UI 440 displayed to a user during the image capture process. The UI 440 can include a video preview 442 of the current field of view of the camera of the mobile device. The UI 440 can include one or more positioning targets 444 to assist the user in positioning the mobile device. For example, the positioning target 444 can be a box, frame, or colored region overlaid onto the video preview 442 to indicate the appropriate placement of the color calibration target. The size and location of the positioning target 444 can be selected to constrain the distance of the mobile device from the patient to within an optimal range, and also to ensure that the field of view adequately captures the appropriate skin regions and the color calibration target. Once the field of view is correctly aligned, the user can record image data by tapping a record button 446. Additional details regarding the image collection and analysis process are provided below.
[0042] FIG. 5A through FIG. 5F illustrate exemplary image data collected by the mobile app, in accordance with many embodiments. Following the recording process, the mobile app can display the image data to the user for review and approval. Furthermore, in order to ensure that the captured images are of sufficient quality for processing and analysis, the mobile app can include functionality for detecting the image quality, and alert the user to images that do not meet a quality threshold and thereby necessitate a retake. FIG. 5A illustrates an image 500 of sufficient quality, in which the lighting is satisfactory and the patient and color calibration target are both clearly visible. FIG. 5B illustrates a deficient image 502, in which portions of the image are obstructed by glare. FIG. 5C illustrates a deficient image 504, in which the image is too bright. FIG. 5D illustrates a deficient image 506, in which the color calibration target is partially obstructed. FIG. 5E illustrates a deficient image 508, in which the image is partially obscured by shadows. FIG. 5F illustrates a deficient image 510, in which the image is too dark.
[0043] FIG. 6 illustrates a method 600 for estimating the bilirubin level in a patient, in accordance with many embodiments. The method 600, as with all other methods described herein, can be practiced using any of the systems and devices disclosed herein, such as a mobile device or a separate computing system (e.g., a remote server) in communication with a mobile device. Certain steps of the method 600, as with all other methods described herein, can be optional, or may be combined with or substituted for suitable steps of other methods disclosed herein.
[0044] In act 610, image data is received for at least one image including a region of the patient's skin and a color calibration target. The image data can be collected using the camera of a mobile device, as previously described herein. Image data can include photographic data, video data, or suitable combinations thereof. Photographic data and video data can be captured sequentially or simultaneously (e.g., images are taken during video recording). The image data can be obtained with and/or without using flash illumination. In some instances, flash illumination can be used to cancel out environmental lighting, such that the lighting in the resultant images is determined solely by contributions from the flash illumination. This can be advantageous to produce more consistent lighting or in situations where the environmental lighting is suboptimal (e.g., too dark, strongly colored, etc.).
[0045] In many embodiments, the image data is collected using a mobile app that controls the mobile device's flash unit, as well as the sequence and number of images taken. For example, the app can turn on the flash unit during the initial positioning of the mobile device, so that the user can assess the amount of glare in the image (e.g., via the video preview) and reposition the device as necessary to reduce or eliminate glare. During the recording process, the app can control the mobile device to first obtain image data with the flash unit on and then obtain image data with the flash unit off, thereby generating two image sets with and without flash
illumination, respectively. For example, when the user initiates the image capture process (e.g., by pressing a record button), the app can be configured to record a video of the patient and calibration target, with the flash unit on for the first half and off for the second half. The app can also capture two photographs taken during the first and second halves of the video recording, respectively. The overall length of the video can be any suitable time, such as approximately 10 seconds. The timing of the image capture process can further be configured to ensure that the image sensor (e.g., charge-coupled device (CCD) array) of the mobile device has stabilized before the next image is taken. For example, the app can include a specified amount of delay time (e.g., three to four seconds) before recording each image set.
[0046] As previously discussed, once the image capture sequence is complete, the mobile app can analyze the image data to determine whether it is of sufficient quality for subsequent use. For instance, the app can implement suitable image analysis techniques (e.g., computer vision) to assess image quality. An exemplary procedure involves extracting the color calibration target from the captured image data, checking the color consistency across each color region of the calibration target (e.g., determine whether the standard deviation of pixel values for each color region falls below a predetermined threshold), and recommending that the user retake any images that do not pass this test. In some instances, the app can be configured to capture multiple sets of image data per session, so as to maximize the likelihood that at least some of the image data will meet the quality standards. The approved images can then be processed on the mobile device or transmitted to a separate computing system for processing, as described in further detail below.
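To make the consistency test concrete, the sketch below (an illustration only, with an assumed threshold value and assumed region masks) checks whether the pixel values within each standardized color region are uniform enough for an image to be accepted.

```python
# Illustrative sketch of the image-quality check described above: each standardized
# color region of the calibration target is tested for color consistency by thresholding
# the standard deviation of its pixel values. The threshold and mask format are assumed.
import numpy as np

def region_is_consistent(image, region_mask, max_std=12.0):
    """image: H x W x 3 RGB array (0-255); region_mask: H x W boolean mask of one region."""
    pixels = image[region_mask]              # N x 3 pixels belonging to the color region
    per_channel_std = pixels.std(axis=0)     # standard deviation of R, G, B separately
    return bool(np.all(per_channel_std < max_std))

def frame_passes_quality_check(image, region_masks, max_std=12.0):
    # Recommend a retake if any color region fails the consistency test.
    return all(region_is_consistent(image, m, max_std) for m in region_masks)
```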
[0047] In act 620, color-balanced image data for the skin region is generated based on a subset of the image data corresponding to the color calibration target and the skin region. Color balancing can be performed in order to compensate for different lighting conditions, as the color content of the image data can vary considerably based on the environmental lighting (e.g., intensity, color, type (halogen, fluorescent, natural, etc.)). In many embodiments, the image data is initially normalized by dividing the three red, green, blue (RGB) color channels by the overall luminescence of the image. Furthermore, the observed pixel color values for one or more of the standardized color regions of the color calibration target can be used to determine the color balancing adjustments to be applied to the skin regions in the image data, as discussed below.
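The following is a minimal sketch of this initial normalization step; the overall luminescence is approximated here by the mean pixel value, which is an assumption made for illustration.

```python
# Sketch of normalizing the RGB channels by the image's overall luminescence
# (approximated here by the mean pixel value; the exact luminance measure is assumed).
import numpy as np

def normalize_by_luminance(rgb_image):
    luminance = rgb_image.astype(float).mean()        # overall image brightness
    return rgb_image.astype(float) / max(luminance, 1e-9)
```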
[0048] In act 630, the bilirubin level in the patient is estimated based on the color-balanced image data for the skin region. Since hyperbilirubinemia produces a yellow discoloration of the skin, the bilirubin level can be determined based on the amount of yellow in the color-balanced skin region image data. This determination can be performed using any suitable technique, such as by transforming the color-balanced image data into a plurality of different color spaces selected to facilitate quantifying the overall yellowness of the skin. Image features generated from the transformed image data can be input into a series of machine learning regressions designed to estimate the concentration of bilirubin in the patient's body. These approaches are discussed in further detail below.
[0049] FIG. 7 illustrates a method 700 for generating color-balanced image data, in accordance with many embodiments. The method 700 can be applied to photographic and/or video data obtained of a patient's skin region and color calibration target, as previously discussed. In act 710, the received image data is processed to identify the image data corresponding to the exposed skin region and the image data corresponding to the white color region. For example, the image data can be segmented to extract the pixel values of the color regions of the color calibration target and the skin regions of interest (e.g., sternum, forehead). With respect to the color calibration target, if the mobile app UI includes a positioning target for aligning the calibration target (e.g., positioning target 444 of FIG. 4C), then the approximate pixel coordinates of the color calibration target in the image data are already known, and the search space for the color regions can be constrained accordingly. The positioning of the color regions can be determined by identifying at least two color regions on the calibration target.
[0050] FIG. 8 illustrates identification of a standardized color region in an image, in accordance with many embodiments. The color region segmentation process can be implemented via a suitable algorithm configured to apply predetermined thresholds to the image. In many embodiments, since the cyan, magenta, and yellow color regions have distinct hues and relatively high saturation, the algorithm initially attempts to identify at least two of these regions. The algorithm can convert the image data to a hue, saturation, value (HSV) color space and apply empirically determined thresholds to the hue and saturation channels, thereby obtaining a thresholded hue image 800 and thresholded saturation image 802. An "AND" operation can be performed on the two thresholded images to separate the specified color region from the rest of the image, thereby producing a segmented image 804. Furthermore, since the approximate size of each color region is predetermined, this information can be used to differentiate the color region from the noise in the image data (e.g., using edge detection, morphological operations, etc.). The algorithm can then use contour detection to identify the boundary of the color region, with contour smoothing performed using suitable techniques such as the Douglas-Peucker algorithm. Since the overall arrangement of the color regions is known, once the positions of two color regions are determined, the overall orientation of the calibration target can be calculated and used to extrapolate the positions of the other regions, including the white color region.
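The sketch below illustrates one way the thresholding, "AND" operation, and contour steps could be arranged using OpenCV. The specific hue and saturation ranges are placeholders, since the text describes the thresholds only as empirically determined.

```python
# Illustrative sketch of segmenting one high-saturation color region (e.g., cyan,
# magenta, or yellow) of the calibration card. Threshold values are assumptions;
# OpenCV 4's findContours signature is assumed.
import cv2

def segment_color_region(bgr_image, hue_range, sat_range):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hue, sat = hsv[:, :, 0], hsv[:, :, 1]

    # Threshold the hue and saturation channels separately ...
    hue_mask = cv2.inRange(hue, hue_range[0], hue_range[1])
    sat_mask = cv2.inRange(sat, sat_range[0], sat_range[1])

    # ... then AND the two thresholded images to isolate the color region.
    mask = cv2.bitwise_and(hue_mask, sat_mask)

    # Contour detection plus Douglas-Peucker smoothing to recover the region boundary.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.approxPolyDP(largest, 0.02 * cv2.arcLength(largest, True), True)

# Example call with an assumed hue/saturation window for a yellow patch
# (OpenCV's hue scale is 0-179):
# corners = segment_color_region(frame, hue_range=(20, 35), sat_range=(100, 255))
```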
[0051] In act 720, the white color region data is processed to determine observed color values for the white color region. In act 730, color-balanced image data for the exposed skin region is generated based on the observed color values for the white color region. The observed color values for the white color region can include RGB color values, which can be used to adjust the RGB values of the image data for the skin region. For example, for raw observed color values x = (R', G', B') of the skin region, the adjusted color values for the skin region (R, G, B) can be obtained by

R = R' / R'_w,   G = G' / G'_w,   B = B' / B'_w,

where x_w = (R'_w, G'_w, B'_w) is the vector of raw observed color values of the white region on the color calibration target.
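A short sketch of this white balancing step follows; the helper name and the use of the white region's mean pixel value as (R'_w, G'_w, B'_w) are assumptions for illustration.

```python
# Sketch of the channel-wise white balancing described above: each skin-region channel
# is divided by the corresponding channel of the white patch on the calibration card.
import numpy as np

def white_balance(skin_pixels, white_region_pixels):
    """skin_pixels: N x 3 raw RGB values; white_region_pixels: M x 3 raw RGB values."""
    white_rgb = white_region_pixels.astype(float).mean(axis=0)   # (R'_w, G'_w, B'_w)
    return skin_pixels.astype(float) / white_rgb                 # R'/R'_w, G'/G'_w, B'/B'_w
```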
[0052] FIG. 9A illustrates a method 900 for estimating the bilirubin level in a patient, in accordance with many embodiments. The method 900 can be practiced using color-balanced RGB image data obtained as previously described herein. In act 910, the color-balanced RGB image data is transformed into at least one other color space to generate color-balanced image data for the exposed skin region for the at least one other color space. As discussed above, the yellowness of the color-balanced skin region image data correlates to the bilirubin level in the patient's body. In many embodiments, the image sensor (e.g., CCD or CMOS sensor) of the mobile device interpolates reflected light into red, green, and blue wavelengths, which can prevent the image sensor from capturing the reflection of the yellow wavelength band clearly.
Accordingly, the color-balanced RGB data can be transformed into a plurality of different color spaces, such as cyan, magenta, and yellow (CMY); cyan, magenta, yellow, and black (CMYK); YCbCr; or Lab color spaces. Any suitable number or combination of color spaces can be used. For example, in many embodiments, the RGB image data is transformed into CMYK, YCbCr, and Lab color spaces. Alternatively, the RGB image data can be transformed into CMY, YCbCr, and Lab color spaces.
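The sketch below shows one way the color space transformations could be computed, assuming scikit-image is available for the YCbCr and Lab conversions; the RGB-to-CMYK conversion is written out directly.

```python
# Sketch of transforming color-balanced RGB data into CMYK, YCbCr, and Lab color spaces.
import numpy as np
from skimage import color

def to_color_spaces(rgb):                # rgb: H x W x 3 floats in [0, 1]
    ycbcr = color.rgb2ycbcr(rgb)         # luma plus blue/red chroma channels
    lab = color.rgb2lab(rgb)             # lightness plus a/b opponent channels

    # Simple RGB -> CMYK conversion: K = 1 - max(R, G, B), remaining channels scaled by K.
    k = 1.0 - rgb.max(axis=2)
    denom = np.where(k < 1.0, 1.0 - k, 1.0)        # avoid division by zero for black pixels
    c = (1.0 - rgb[..., 0] - k) / denom
    m = (1.0 - rgb[..., 1] - k) / denom
    y = (1.0 - rgb[..., 2] - k) / denom
    cmyk = np.stack([c, m, y, k], axis=2)
    return ycbcr, lab, cmyk
```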
[0053] In act 920, a plurality of normalized chromatic and achromatic features are processed to select a first estimated range of the bilirubin level from a plurality of different bilirubin ranges. The features can be chromatic and/or achromatic (e.g., luminescence) values for the skin region, with each feature corresponding to a color space value of the color spaces used. In some instances, the features can include calculations of one or more color gradients across the skin region. Any suitable number and combination of features can be used. For example, three features can be extracted from each of four color spaces (e.g., RGB, CMY, YCbCr, Lab; or RGB, CMYK, YCbCr, Lab) to obtain 12 chromatic and achromatic features. Furthermore, features can be separately extracted from image data obtained with flash illumination and without flash illumination, respectively, resulting in a total of 24 chromatic and achromatic features. The features can be normalized based on the overall luminescence of the image, as previously described herein with respect to act 620 of the method 600. The extracted features can be used as inputs to machine learning regressions used for estimating the bilirubin level. The regressions can utilize some or all of the extracted features, with the optimal subset of features to be used selected based on machine learning techniques. The machine learning regressions described herein can be trained on suitable data sets, such as clinical patient data, and can incorporate any suitable number and combination of parametric and non-parametric regression models.
Exemplary regressions suitable for use with the methods described herein are provided below. In many embodiments, the initially calculated features and the output of the selected regressors are used to classify the estimated bilirubin level into one of a plurality of different bilirubin ranges, such as low, medium, and high ranges.
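The following sketch assembles example chromatic and achromatic features, reusing the to_color_spaces helper from the earlier sketch. The choice of the mean channel value as each feature, and of which CMYK channels to keep, are assumptions for illustration.

```python
# Sketch of extracting 12 chromatic/achromatic features from one color-balanced image;
# concatenating flash and no-flash images yields the 24 features described above.
import numpy as np

def skin_features(balanced_rgb, mask):
    """balanced_rgb: H x W x 3 color-balanced image in [0, 1]; mask: skin-region mask."""
    ycbcr, lab, cmyk = to_color_spaces(balanced_rgb)     # helper from the earlier sketch
    feats = []
    for space, channels in ((balanced_rgb, (0, 1, 2)),   # e.g., R, G, B
                            (ycbcr, (0, 1, 2)),          # e.g., Y, Cb, Cr
                            (lab, (0, 1, 2)),            # e.g., L, a, b
                            (cmyk, (1, 2, 3))):          # e.g., M, Y, K (assumed choice)
        for ch in channels:
            feats.append(space[..., ch][mask].mean())
    return np.array(feats)                               # 12 features per image

# Features from the flash and no-flash images can then be concatenated:
# features = np.concatenate([skin_features(img_flash, mask), skin_features(img_no_flash, mask)])
```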
[0054] In act 930, the features are processed using a processing approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level.
Similar to act 920, the normalized chromatic and achromatic features can be used to inform one or more machine learning regressions. The inputs to the regressions can differ based on whether the first estimated range of the bilirubin level is low, medium, or high, as provided in greater detail below. For example, the classification of the first estimated range can be used as input to the machine learning regressions. The results of the initial regression performed in act 920 can also be used as input. This "two-tiered" approach to bilirubin estimation can be used to generate more accurate estimation results compared to direct estimation.
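A compact sketch of this two-tiered arrangement is shown below; the specific models and the integer coding of the range labels are assumptions for illustration.

```python
# Sketch of the two-tiered estimation: a first model picks a coarse bilirubin range,
# and the range label is appended to the features for a final regression.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

def fit_two_tier(X, y_range, y_level):
    """X: feature matrix; y_range: integer-coded range labels (0=low, 1=medium, 2=high);
    y_level: measured bilirubin levels used as regression targets."""
    range_clf = RandomForestClassifier().fit(X, y_range)        # tier 1: range selection
    X_aug = np.column_stack([X, range_clf.predict(X)])          # append range label
    final_reg = RandomForestRegressor().fit(X_aug, y_level)     # tier 2: level estimate
    return range_clf, final_reg

def predict_two_tier(range_clf, final_reg, X):
    X_aug = np.column_stack([X, range_clf.predict(X)])
    return final_reg.predict(X_aug)
```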
[0055] FIG. 9B illustrates a method 950 for estimating the bilirubin level in a patient, in accordance with many embodiments. The method 950 can be practiced in combination with the method 900 in order to obtain a more accurate estimate of the bilirubin level. Similar to the method 900, the method 950 can be practiced using color-balanced RGB image data obtained as previously described herein. In act 960, the color-balanced RGB image data is transformed into at least one other color space to generate color-balanced image data for the exposed skin region for the at least one other color space, as discussed above with respect to act 910 of the method 900.
[0056] In act 970, the color-balanced image data for the exposed skin region is processed to determine a color space value for the patient's skin. The color space value for the patient's skin can be used to classify the patient's skin color into one of a plurality of different skin color types, such as light-skinned, medium-skinned, and dark-skinned. The skin color type can be related to the race and/or ethnicity of the patient. The skin color type can be determined based on the color values of the skin region and/or color calibration target obtained from the color-balanced RGB image data. For example, the skin region can be compared to one or more standardized color regions of the color calibration target (e.g., the first and second skin tone color regions) to determine a skin color type.
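The sketch below illustrates one possible comparison of the skin region's mean color with the two skin-tone patches on the calibration card; the nearest-patch rule and its margin are assumptions, since the text leaves the exact comparison open.

```python
# Sketch of classifying the skin color type by comparing the skin region's mean color
# with the light and dark skin-tone reference patches on the calibration card.
import numpy as np

def skin_color_type(skin_rgb_mean, light_patch_rgb, dark_patch_rgb):
    d_light = np.linalg.norm(skin_rgb_mean - light_patch_rgb)
    d_dark = np.linalg.norm(skin_rgb_mean - dark_patch_rgb)
    if abs(d_light - d_dark) < 0.1 * (d_light + d_dark):
        return "medium-skinned"            # roughly equidistant from both reference tones
    return "light-skinned" if d_light < d_dark else "dark-skinned"
```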
[0057] In act 980, a plurality of normalized chromatic and achromatic features are processed using an approach based on the determined skin color to estimate the bilirubin level. The normalized chromatic and achromatic features can be extracted from the color-balanced image data for one or more different color spaces, as previously described above with respect to the method 900. The features, along with the determined skin color, can be used as inputs to suitable machine learning regressions, similar to the act 930 of the method 900.
[0058] FIG. 10 illustrates a method 1000 for estimating the bilirubin level in a patient, in accordance with many embodiments. The method 1000 includes obtaining images from a camera 1002, color balancing the image data 1004, performing feature extraction from the color-balanced image data 1006, performing machine learning regression based on the features 1008, and generating a bilirubin estimate 1010.
[0059] Image data can be obtained from the camera of a mobile device under the control of a suitable mobile app (act 1002). Some of the image data can be obtained with flash illumination and some of the image data can be obtained without flash illumination. Once the app has verified that the images are of sufficient quality for processing and analysis, the image data can be color balanced (act 1004). The color balancing can involve identification of the image data subsets corresponding to the color calibration target, and automatic segmentation of the standardized color regions of the target (act 1012) using the thresholding methods previously described herein. The segmented white color region can be used to white balance the image data (act 1014), thereby generating color-balanced image data. The color-balanced image data can be RGB image data. One or more features can be extracted from the color-balanced image data (act 1006). The feature extraction process can involve transforming the image data from an RGB color space to a plurality of different color spaces (act 1016), as previously described herein. The transformed image data can then be used to calculate a plurality of normalized chromatic and achromatic features (act 1018).
[0060] Some or all of the extracted features can be used as inputs to machine learning
regressions (act 1008). Any suitable number and/or combination of regressions can be used. For example, an initial set of machine learning regressions can include five different regressions (acts 1020, 1022, 1024, 1026, and 1028). For example, the first regression can include one or more encapsulated k-Nearest Neighbor (kNN) regressions (act 1020). This regression can utilize a database of known features and bilirubin values. When an unknown test vector is analyzed, the convex hull can be found around the test vector in the database of features. Feature points from the convex hull can then be used in a linear regression. A new regression can be built each time that a new test point is analyzed. The parameters for creating the hull can be normalized values of luminosity (e.g., from the YCbCr color space transform) and the "green" channel (e.g., from the RGB color space). This can be used to guarantee that the convex hull calculation only occurs in two dimensions, thus ensuring the number of points in the convex hull is tractable (e.g., approximately four to six points).
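The sketch below is one simplified reading of this encapsulated kNN regression: the training points enclosing the test point in the two-dimensional (luminosity, green) projection are found via a Delaunay triangulation, and a local linear regression is fit on those points. Function and parameter names are assumptions.

```python
# Sketch of a per-test-point local linear regression built from the training points
# that enclose the test point in a 2-D projection of the feature space.
import numpy as np
from scipy.spatial import Delaunay
from sklearn.linear_model import LinearRegression

def encapsulated_knn_predict(train_features, train_bilirubin, test_features, hull_dims=(0, 1)):
    dims = list(hull_dims)                         # e.g., luminosity and "green" channel
    pts2d = train_features[:, dims]
    test2d = test_features[dims]
    tri = Delaunay(pts2d)
    simplex = int(tri.find_simplex(test2d[None, :])[0])
    if simplex == -1:
        # Test point lies outside the training data: fall back to its nearest neighbours.
        idx = np.argsort(np.linalg.norm(pts2d - test2d, axis=1))[:5]
    else:
        # Vertices of the enclosing triangle plus its neighbours (roughly 4-6 points).
        neighbours = [s for s in tri.neighbors[simplex] if s != -1]
        idx = np.unique(tri.simplices[[simplex] + neighbours].ravel())
    local = LinearRegression().fit(train_features[idx], train_bilirubin[idx])
    return local.predict(test_features.reshape(1, -1))[0]
```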
[0061] The second regression can include one or more lasso regressions and/or one or more least angle regressions (LARS) (act 1022). LARS can be helpful for deciding which features out of the total set of extracted features are the most useful, using a variant of forward feature selection. For example, the best predictor(s) from the feature set can be chosen by developing a single- feature, linear regression from each feature. The most correlated output can be chosen as the "first" feature. This prediction can be subtracted from the output to obtain the residuals. The algorithm can diverge from other forward feature selection algorithms in that it attempts to find another feature with roughly the same correlation to the residuals as the first feature to the output. It can then find the "equiangular" direction between the two estimates, and can find a third feature that maximizes correlation to the new residuals along the equiangular direction. A new angle can then be found from the previous features and a new feature added to the set. Features can be added in this way until the desired accuracy is met.
[0062] The third regression can include one or more elastic net regressions, also known as elastic net algorithms (act 1024). The elastic net regression is a combination of lasso regression and ridge regression. Instead of just using forward feature selection, however, the algorithm can also employ the L1 and L2 norms in its objective function. This makes it related to LARS and Lasso, but with certain "backoff" regularization so that it can become more stable. The parameters can be cross-validated using a stepwise exhaustive process.
[0063] The fourth regression can include one or more support vector (SV) regressions (act 1026). Two SV regressions can be employed in order to capture the possible non-linear relationship between the image data and the bilirubin levels. SV regression can be used to find a linear regression function in a high dimensional feature space. Then, the input data can be mapped into the space using a potentially nonlinear function. The first SV regression can use a linear kernel and the second SV regression can assign higher weight to higher-rated bilirubin values using a nonlinear radial basis function.
[0064] The fifth regression can include one or more random forest regressions (act 1028). For example, the fifth regression can use a random forest regression with 500 trees. A random forest is a collection of estimators. It can use many "classifying" decision trees on various sub-samples of the dataset. The outputs of these trees can be averaged to improve the predictive accuracy and to control over-fitting. Each tree can be created using a random sub-sample (with replacement).
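The sketch below wires up scikit-learn stand-ins for the second through fifth regressions (acts 1022-1028); all hyperparameters are assumptions, as the text names only the regression families.

```python
# Sketch of the initial regression bank: LARS/lasso, elastic net, two SV regressions,
# and a 500-tree random forest. Hyperparameter values are assumed for illustration.
import numpy as np
from sklearn.linear_model import Lars, Lasso, ElasticNet
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor

def fit_base_regressors(X, y):
    models = {
        "lars": Lars().fit(X, y),                            # least angle regression
        "lasso": Lasso(alpha=0.1).fit(X, y),                 # L1-regularized linear model
        "elastic_net": ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y),
        "svr_linear": SVR(kernel="linear").fit(X, y),
        # RBF-kernel SVR with larger sample weight on higher bilirubin values
        "svr_weighted": SVR(kernel="rbf").fit(X, y, sample_weight=1.0 + y / y.max()),
        "random_forest": RandomForestRegressor(n_estimators=500).fit(X, y),
    }
    return models

def base_predictions(models, X):
    # Stack each model's prediction as an extra feature for the later classifier stage.
    return np.column_stack([m.predict(X) for m in models.values()])
```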
[0065] In many embodiments, other types of regressions can also be used, in addition to or substituted for one or more of the regressions described herein. For example, one or more linear regressions can also be used. The method 1000 can be practiced using any suitable type of regression, including linear and non-linear regressions.
[0066] Following the initial regressions, one or more multi-layer classifiers can be used (act 1030). For example, the initially calculated features, along with the output of the initial set of regressors, can be used to classify a first estimated range of the bilirubin level into one of a plurality of different bilirubin ranges, as previously described herein. The classifiers can include a random forest classifier, a support vector machine, and a k-Nearest Neighbor classifier (k = 3). The results of all the classifiers, as well as the log-likelihood for each class, can be used as the features for a final stacked regression, which can be a final random forest regression (act 1032). The original extracted features and the results of the initial regressions can also be used as the features for this final stage regression. The final regressor can be trained using suitable machine learning algorithms, such as AdaBoost. The output of the final regressor can be used as the final estimate of the bilirubin level 1010 (e.g., measured in milligrams per deciliter). In order to avoid overfitting, leave-one-out cross validation can be used at all levels of learning.
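A sketch of this stacked final stage is given below, showing only the wiring of the classifiers, their log-likelihoods, and the final random forest; the AdaBoost training and leave-one-out cross-validation mentioned above are omitted, and the integer coding of range labels is an assumption.

```python
# Sketch of the stacked final stage: range classifiers run on the original features plus
# the base-regressor outputs, and their results feed a final random forest regressor.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

def fit_stacked_estimator(X, base_preds, y_range, y_level):
    """y_range: integer-coded range labels (e.g., 0=low, 1=medium, 2=high)."""
    X_mid = np.column_stack([X, base_preds])
    classifiers = [
        RandomForestClassifier().fit(X_mid, y_range),
        SVC(probability=True).fit(X_mid, y_range),
        KNeighborsClassifier(n_neighbors=3).fit(X_mid, y_range),
    ]
    # Class predictions and per-class log-likelihoods become features for the final regression.
    parts = [X_mid]
    for clf in classifiers:
        parts.append(clf.predict(X_mid).reshape(-1, 1))
        parts.append(np.log(clf.predict_proba(X_mid) + 1e-9))
    X_final = np.column_stack(parts)
    final = RandomForestRegressor().fit(X_final, y_level)       # bilirubin in mg/dL
    return classifiers, final
```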
[0067] FIG. 11A illustrates a method 1100 for estimating the bilirubin level in a patient using baseline skin color data, in accordance with many embodiments. The method 1100 is similar to the method 600, except that the current skin color data of the patient is compared to baseline skin color data in order to generate the bilirubin estimate. This approach can be used to compensate for factors that may confound the image analysis, such as differing skin tones due to the patient's race or ethnicity.
[0068] In act 1110, baseline skin color data for the patient is received, the baseline skin color data corresponding to when the patient has a reference bilirubin level. The reference bilirubin level can be a known bilirubin level (e.g., based on TSB or TcB testing). If the patient is an infant, the baseline skin color data can be collected within the first 24 hours of the infant's birth, when the bilirubin level is typically very low, such as approximately zero. Suitable methods for collecting and generating baseline skin color data are described below.
[0069] In act 1120, image data is received for at least one image including a region of the patient's skin and a color calibration target. In act 1130, color-balanced image data for the skin region is generated based on a subset of the image data corresponding to the color calibration target and the skin region. The image data collection and color-balancing processes can be similar to those previously described herein with respect to acts 610 and 620 of the method 600, respectively. The acts 1120 and 1130 can be performed at any time after the baseline skin color data is received and can be repeated over any suitable period of time (e.g., the first four to five days of the infant's life) so as to generate sequential image data sets used to determine whether the skin is becoming more yellow.
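For concreteness, a minimal white-patch color-balancing sketch, assuming the pixels of the calibration target's white region and of the exposed skin region have already been segmented into the hypothetical arrays `white_pixels` and `skin_pixels`; the per-channel gain model shown is one common white-balance approach, not necessarily the exact transform used in act 1130.

```python
import numpy as np

def color_balance_skin(skin_pixels, white_pixels, white_reference=255.0):
    """Scale each RGB channel so the observed white patch maps to reference white."""
    observed_white = white_pixels.reshape(-1, 3).mean(axis=0)  # mean RGB of white region
    gains = white_reference / observed_white                   # per-channel correction gains
    return np.clip(skin_pixels.reshape(-1, 3) * gains, 0.0, 255.0)

# Hypothetical segmented pixel arrays (N x 3, 8-bit RGB values).
white_pixels = np.array([[240.0, 235.0, 220.0], [238.0, 233.0, 221.0]])
skin_pixels = np.array([[180.0, 150.0, 120.0], [178.0, 149.0, 118.0]])

balanced_skin = color_balance_skin(skin_pixels, white_pixels)
```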
[0070] In act 1140, the bilirubin level in the patient is estimated based on differences between the baseline skin color data and the color-balanced image data for the skin region. In many embodiments, the baseline skin color data serves as a standard against which the current image data is compared. As previously described herein with respect to act 630 of the method 600, the estimation can be performed by transforming the color-balanced image data into a plurality of color spaces, extracting features from the transformed image data, and then using the features for machine learning regressions to generate a bilirubin estimate. At least some of the regressions described herein can also use some or all of the features extracted from the baseline skin color data.
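A brief sketch of transforming color-balanced skin pixels into additional color spaces and forming difference features against the baseline, assuming scikit-image is available; the hypothetical arrays and the mean-value summary statistics are illustrative only.

```python
import numpy as np
from skimage.color import rgb2lab, rgb2ycbcr

def color_features(rgb_pixels):
    """Mean channel values in RGB, YCbCr, and Lab for one skin region."""
    rgb = rgb_pixels.reshape(1, -1, 3) / 255.0  # image-shaped array, values in 0-1
    return np.concatenate([rgb.mean(axis=(0, 1)),
                           rgb2ycbcr(rgb).mean(axis=(0, 1)),
                           rgb2lab(rgb).mean(axis=(0, 1))])

# Hypothetical color-balanced skin pixels from the current and baseline images.
balanced_skin = np.array([[190.0, 160.0, 110.0], [188.0, 158.0, 112.0]])
baseline_skin = np.array([[185.0, 165.0, 140.0], [184.0, 166.0, 141.0]])

# Difference features that can feed the regressions estimating the bilirubin level.
diff_features = color_features(balanced_skin) - color_features(baseline_skin)
```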
[0071] FIG. 11B illustrates a method 1150 for generating baseline skin color data, in accordance with many embodiments. In act 1160, baseline image data for the patient is captured when the patient has the reference bilirubin level, the baseline image data corresponding to at least one image including the skin region and a baseline color calibration target. The collection of the baseline image data can be similar to the collection procedures previously described herein with respect to act 610 of the method 600. The baseline color calibration target can be the same as the color calibration target used for collecting regular image data, or can be a different color calibration target. Similarly, the baseline image data can include the same skin regions as regular image data, or different skin regions. As described above, the reference bilirubin level can be any known bilirubin level, such as a bilirubin level determined by testing or taken within 24 hours of birth.
[0072] In act 1170, color-balanced baseline image data is generated for the skin region based on the baseline image data. The generation of the color-balanced baseline image data can be performed using any of the techniques previously described herein with respect to regular image data. In act 1180, the baseline skin color data is generated based on the color-balanced baseline image data for the skin region. This process can involve feature extraction from the color- balanced baseline image data and using the features for machine learning regressions, as discussed above.
[0073] FIG. 12 illustrates a mobile device 1200 for estimating bilirubin level in a patient, in accordance with many embodiments. The mobile device 1200 includes a camera 1202 suitable for capturing image data, and a flash unit 1204 suitable for providing flash illumination. The camera 1202 and flash unit 1204 can be built-in hardware of the mobile device 1200.
Alternatively, the camera 1202 and flash unit 1204 can be provided separately from and coupled to the mobile device 1200 (e.g., via wired or wireless communication). In some instances, the mobile apps described herein can detect whether the camera 1202 is capable of capturing images with sufficiently high resolution for the subsequent image analysis, and can alert the user if this criterion is not met.
[0074] The mobile device 1200 also includes an input unit 1206 for receiving input from a user and a display 1208 for displaying content to the user. The input unit 1206 can include keyboards, mice, touchscreens, joysticks, and the like. The input unit 1206 can also be configured to accept voice commands or gestural commands. The display 1208 can include a monitor, screen, touchscreen, and the like. In some instances, the input unit 1206 and the display 1208 can be implemented across shared hardware (e.g., the same touchscreen is used to accept input and display output). As previously described, the display 1208 can display one or more suitable UIs to the user, such as UIs of a mobile app for estimating bilirubin levels.

[0075] The mobile device 1200 includes one or more processors 1210, a memory or other data storage device 1212 storing image data as well as one or more software modules 1214, and a communication unit 1216. The processors 1210 can be operably coupled to the camera 1202 and/or flash unit 1204 to control one or more functions (e.g., record function, zoom function, flash illumination function). The processor 1210 can also be operably coupled to the memory 1212 such that the processor 1210 can receive and execute instructions provided by the software module 1214. The software module 1214 can be implemented as part of the mobile apps described herein and can provide instructions for carrying out one or more acts of the previously discussed methods. For example, the software module 1214 can enable the mobile device 1200 to capture image data of a patient and calibration target. In some instances, the software module 1214 can perform some or all of the image processing and analysis tasks disclosed above (e.g., color balancing, feature extraction, machine learning regression). The software module 1214 can be adapted to a plurality of different types of mobile devices. Furthermore, the mobile device 1200 can be configured to receive and install software updates, such as updates improving one or more image capture, processing, and analysis algorithms, thereby enabling the mobile app to be easily and quickly upgraded as necessary.
[0076] The communication unit 1216 of the mobile device 1200 can be configured to receive and/or transmit data (e.g., image data, bilirubin estimates, software updates, etc.) between the mobile device 1200 and a separate device or system, such as a remote server or other computing system. The communication unit can use any suitable combination of wired or wireless communication methods, including Wi-Fi communication. In some instances, the
communication between the mobile device 1200 and the separate device can be performed using short message service (SMS) text messaging. The communication unit 1216 can also be operably coupled to the processors 1210, such that data communication to and from the mobile device 1200 can be controlled based on instructions provided by the software module 1214.
[0077] FIG. 13 illustrates a mobile device 1300 in communication with a data processing system 1302 for estimating bilirubin levels, in accordance with many embodiments. The mobile device 1300 can include any of the components previously described herein with respect to the mobile device 1200 of FIG. 12. The components of the data processing system 1302 can be
implemented across any suitable combination of physical and/or virtualized computing resources (e.g., virtual machines), including distributed computing resources ("in the cloud"). In many embodiments, the data processing system 1302 is a remote server configured to communicate with a plurality of user mobile devices including the mobile device 1300. The communication can utilize any suitable wired or wireless communication methods, as described above.

[0078] The data processing system 1302 includes one or more processors 1304, a memory or other data storage device 1306 storing one or more software modules 1308, and a communication unit 1310. The communication unit 1310 can be used to communicate data (e.g., image data, bilirubin estimates, software updates, etc.) between the system 1302 and the mobile device 1300 (e.g., via SMS text messaging). For example, the communication unit 1310 can receive image data provided by the mobile device 1300, such as image data that has not yet been color balanced. The data obtained from the mobile device 1300 can be stored in the memory 1306. The software module 1308 can provide instructions executable by the processors 1304 to process and analyze the image data (e.g., color balancing, feature extraction, machine learning
regressions), such as by performing one or more acts of the methods described herein. The processors 1304 can output an estimate of the patient's bilirubin level, which can be stored in the memory 1306 and/or transmitted to the mobile device 1300. In some instances, depending on user preference, the results can also be transmitted to a third party, such as a medical professional who can review the results and provide the user with further instructions as necessary.
[0079] In an alternative embodiment, the mobile devices described herein can be configured to illuminate a patient's skin with different wavelengths of light (e.g., 460 nm, 540 nm) and capture images of the illuminated skin using a camera. The timing and sequence of illumination can be controlled by a mobile app. The mobile app can analyze the collected image data to measure the intensity of different wavelengths of light reflected from the skin regions. In many embodiments, the absorption of different wavelengths differs based on the color of the skin, including yellow discoloration. The absorption of some wavelengths can be affected by bilirubin levels, such that their reflected intensities provide an indication of the amount of bilirubin in the patient's body. Accordingly, suitable machine learning regressions and/or models can be developed to enable wavelength absorption to be used as input for estimating bilirubin levels in a patient. Advantageously, this approach can be more robust to different environmental and situational conditions.
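As a rough illustration of this alternative approach, a sketch that reduces captures under two illumination wavelengths to mean reflected intensities over the skin region; the image arrays, the mask, and the 460/540 nm intensity ratio feature are assumptions made for the example, not the patent's specified model.

```python
import numpy as np

def mean_reflected_intensity(image, skin_mask):
    """Average pixel intensity over the skin region for one illumination wavelength."""
    return float(image[skin_mask].mean())

# Hypothetical grayscale captures of the same skin region under each wavelength,
# plus a boolean mask marking the skin pixels of interest.
img_460nm = np.random.randint(0, 256, (480, 640)).astype(float)
img_540nm = np.random.randint(0, 256, (480, 640)).astype(float)
skin_mask = np.zeros((480, 640), dtype=bool)
skin_mask[200:280, 300:380] = True

i460 = mean_reflected_intensity(img_460nm, skin_mask)
i540 = mean_reflected_intensity(img_540nm, skin_mask)

# Bilirubin absorbs strongly near 460 nm, so the 460/540 intensity ratio can
# serve as one input feature for the machine learning regressions.
features = np.array([i460, i540, i460 / i540])
```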
[0080] In many embodiments, the mobile device includes a front-facing camera (e.g., a camera disposed on the same side of the mobile device as the screen) that can be used to capture image data (e.g., photographs, videos) used to assess ambient lighting conditions. Such data provides alternative and/or additional ambient lighting information that can be used to normalize the color data of the patient's skin during estimation of the bilirubin level of the patient.
[0081] The various techniques described herein may be partially or fully implemented using code that is storable upon storage media and computer readable media, and executable by one or more processors of a computer system. Storage media and computer readable media for containing code, or portions of code, can include any appropriate media known or used in the art, including storage media and communication media, such as but not limited to volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information such as computer readable instructions, data structures, program modules, or other data, including RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disk (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, solid state drives (SSD) or other solid state storage devices, or any other medium which can be used to store the desired information and which can be accessed by a system device. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the various embodiments.
[0082] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims

WHAT IS CLAIMED IS:
1. A method of estimating the level of bilirubin in a patient, the method comprising: receiving image data for at least one image including a region of the patient's skin and a color calibration target;
generating color-balanced image data for the skin region based on a subset of the image data corresponding to the color calibration target and the skin region; and
estimating the bilirubin level in the patient based on the color-balanced image data for the skin region.
2. The method of claim 1, further comprising receiving baseline skin color data for the patient corresponding to when the patient has a reference bilirubin level, and wherein said estimating the bilirubin level is based on one or more differences between the baseline skin color data and the color-balanced image data for the skin region.
3. The method of claim 2, wherein the baseline skin color data for the patient is generated by:
capturing baseline image data for the patient when the patient has the reference bilirubin level, the baseline image data corresponding to at least one image including the skin region and a baseline color calibration target;
generating color-balanced baseline image data for the skin region based on a subset of the baseline image data corresponding to the baseline color calibration target and the skin region; and
generating the baseline skin color data based on the color-balanced baseline image data for the skin region.
4. The method of claim 1, wherein the color calibration target comprises a plurality of standardized color regions including a white color region.
5. The method of claim 4, wherein the standardized color regions include a black region, a gray region, a light brown region, a cyan region, a magenta region, a yellow region, and a dark brown region.
6. The method of claim 4, wherein the color calibration target at least partially defines an opening configured to expose the skin region to permit capturing of image data for the skin region.
7. The method of claim 6, wherein the standardized color regions are disposed in a known arrangement surrounding the opening.
8. The method of claim 6, wherein said generating color-balanced image data for the skin region comprises:
processing the received image data to identify a subset of the image data corresponding to the exposed skin region and a subset of the image data corresponding to the white color region;
processing the white color region data to determine observed color values for the white color region; and
generating color-balanced image data for the exposed skin region based on the observed color values for the white color region.
9. The method of claim 8, wherein the observed color values for the white color region comprise red, green, blue (RGB) color space values.
10. The method of claim 1, wherein the color-balanced image data for the skin region comprises RGB color space data, and the method further comprises transforming the RGB color space data into at least one other color space to generate color-balanced image data for the exposed skin region for the at least one other color space.
11. The method of claim 10, wherein the at least one other color space includes: (a) a cyan, magenta, yellow, and black (CMYK) color space; (b) a YCbCr color space; or (c) a Lab color space.
12. The method of claim 10, wherein the at least one other color space includes: (a) a cyan, magenta, yellow (CMY) color space; (b) a YCbCr color space; or (c) a Lab color space.
13. The method of claim 11, wherein the received image data includes an image obtained using flash illumination and an image obtained without using flash illumination.
14. The method of claim 13, wherein said estimating the bilirubin level comprises: processing a plurality of normalized chromatic and achromatic features to select a first estimated range of the bilirubin level from one of a plurality of different bilirubin ranges; and
processing the features using an approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level.
15. The method of claim 14, wherein the plurality of different bilirubin ranges includes a low range, a medium range, and a high range.
16. The method of claim 14, wherein the plurality of features comprises selected color values of the skin region for a plurality of different color spaces.
17. The method of claim 14, wherein the plurality of features comprise a calculation of a color gradient across the skin region.
18. The method of claim 14, wherein processing the features to select a first estimated range of the bilirubin level comprises performing a series of regressions including at least one of: (a) a linear regression, (b) an encapsulated k-Nearest Neighbor regression, (c) a lasso regression, (d) a LARS regression, (e) an elastic net regression, (f) a support vector regression using a linear kernel, (g) a support vector regression assigning higher weight to higher-rated bilirubin values, and (h) a random forest regression.
19. The method of claim 14, wherein processing the features to select a first estimated range of the bilirubin level comprises performing a series of regressions including: (a) a linear regression, (b) an encapsulated k-Nearest Neighbor regression, (c) a lasso regression, (d) a LARS regression, (e) an elastic net regression, (f) a support vector regression using a linear kernel, (g) a support vector regression assigning higher weight to higher-rated bilirubin values, and (h) a random forest regression.
20. The method of claim 14, wherein said using a processing approach based on the selected first estimated range of the bilirubin level comprises performing a final random forest regression that uses the plurality of normalized chromatic and achromatic features and the selected first estimated range of the bilirubin level as the features for the final random forest regression.
21. The method of claim 1, wherein said estimating the bilirubin level comprises determining a color space value for the patient's skin and using a processing approach based on the determined patient's skin color space value to estimate the bilirubin level.
22. A mobile device configured to estimate the level of bilirubin in a patient, the device comprising:
a camera operable to capture image data for a field of view;
a processor operatively coupled with the camera; and
a data storage device operatively coupled with the processor and storing instructions that, when executed by the processor, cause the processor to:
receive image data for an image captured by the camera, the image including a region of the patient's skin and a color calibration target;
generate color-balanced image data for the skin region based on a subset of the image data corresponding to the color calibration target and the skin region; and
estimate the bilirubin level in the patient based on the color-balanced image data for the skin region.
23. The mobile device of claim 22, wherein the color calibration target at least partially defines an opening configured to expose the skin region to permit capturing image data for the skin region and includes a plurality of standardized color regions including a white color region, and wherein the instructions cause the processor to:
process the received image data to identify a subset of the image data corresponding to the exposed skin region and a subset of the image data corresponding to the white color region;
process the white color region data to determine observed color values for the white color region;
generate color-balanced RGB image data for the exposed skin region based on the observed color values for the white color region;
generate color-balanced image data for the exposed skin region for at least one other color space by transforming the color-balanced RGB image data into the at least one other color space; process a plurality of normalized chromatic and achromatic features to select a first estimated range of the bilirubin level from one of a plurality of different bilirubin ranges; and
process the features using an approach based on the selected first estimated range of the bilirubin level to generate a final estimate of the bilirubin level.
24. The mobile device of claim 22, wherein the color calibration target at least partially defines an opening configured to expose the skin region to permit capturing image data for the skin region and includes a plurality of standardized color regions including a white color region, and wherein the instructions cause the processor to:
process the received image data to identify a subset of the image data corresponding to the exposed skin region and a subset of the image data corresponding to the white color region;
process the white color region data to determine observed color values for the white color region;
generate color-balanced RGB image data for the exposed skin region based on the observed color values for the white color region;
generate color-balanced image data for the exposed skin region for at least one other color space by transforming the color-balanced RGB image data into the at least one other color space;
process the color-balanced image data for the exposed skin region to determine a skin color for the patient; and
process a plurality of normalized chromatic and achromatic features using an approach based on the determined skin color to estimate the bilirubin level.
25. The mobile device of either one of claim 23 and claim 24, further comprising a flash unit operable to selectively illuminate the field of view, the received image data processed to estimate the bilirubin level including an image captured with the field of view being illuminated by the flash unit and an image captured with the field of view not being illuminated by the flash unit.
26. A method for estimating the level of bilirubin in a patient, the method comprising: receiving, from a mobile device, image data for an image including a skin region of a patient and a color calibration target;
generating, via one or more processors, color-balanced image data for the skin region based on a subset of the image data corresponding to the color calibration target and the skin region;
estimating, via the one or more processors, the bilirubin level in the patient based on the color-balanced image data for the skin region; and
transmitting the estimated bilirubin level to the mobile device.
27. The method of claim 26, wherein at least one of receiving the image data and transmitting the estimated bilirubin level are performed using short message service (SMS) text messaging.
PCT/US2014/024761 2013-03-12 2014-03-12 Estimating bilirubin levels WO2014172033A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP14784645.5A EP2967359A4 (en) 2013-03-12 2014-03-12 Estimating bilirubin levels
JP2016501634A JP6545658B2 (en) 2013-03-12 2014-03-12 Estimating bilirubin levels
KR1020157028278A KR102237583B1 (en) 2013-03-12 2014-03-12 Estimating bilirubin levels
US14/835,348 US10285624B2 (en) 2013-03-12 2015-08-25 Systems, devices, and methods for estimating bilirubin levels

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361777097P 2013-03-12 2013-03-12
US61/777,097 2013-03-12

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/835,348 Continuation-In-Part US10285624B2 (en) 2013-03-12 2015-08-25 Systems, devices, and methods for estimating bilirubin levels

Publications (1)

Publication Number Publication Date
WO2014172033A1 true WO2014172033A1 (en) 2014-10-23

Family

ID=51731746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/024761 WO2014172033A1 (en) 2013-03-12 2014-03-12 Estimating bilirubin levels

Country Status (4)

Country Link
EP (1) EP2967359A4 (en)
JP (1) JP6545658B2 (en)
KR (1) KR102237583B1 (en)
WO (1) WO2014172033A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10945637B2 (en) 2016-12-28 2021-03-16 Ajou University Industry-Academic Cooperation Foundation Image based jaundice diagnosing method and apparatus and image based jaundice diagnosis assisting apparatus
KR101998595B1 (en) * 2017-10-13 2019-07-11 아주대학교산학협력단 Method and Apparatus for jaundice diagnosis based on an image
EP3462176A1 (en) * 2017-09-29 2019-04-03 Universität Basel Method and computer program for predicting bilirubin levels in neonates

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5671735A (en) * 1983-07-18 1997-09-30 Chromatics Color Sciences International, Inc. Method and apparatus for detecting and measuring conditions affecting color
ATE229174T1 (en) * 1995-06-07 2002-12-15 Chromatics Color Sciences Int METHOD AND DEVICE FOR DETECTING AND MEASURING CONDITIONS WHICH AFFECT COLOR
JP3417235B2 (en) * 1996-12-13 2003-06-16 ミノルタ株式会社 Diagnostic system
JP2000121439A (en) * 1998-10-15 2000-04-28 Sekisui Chem Co Ltd Apparatus and method for determining color
US7711403B2 (en) * 2001-04-05 2010-05-04 Rhode Island Hospital Non-invasive determination of blood components
EP2131697B1 (en) * 2007-03-08 2012-09-12 Hewlett-Packard Development Company, L.P. Method and system for recommending a product based upon skin color estimated from an image
US7856118B2 (en) * 2007-07-20 2010-12-21 The Procter & Gamble Company Methods for recommending a personal care product and tools therefor

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6128516A (en) * 1994-05-09 2000-10-03 Chromatics Color Sciences International Inc. Method and apparatus for detecting and measuring conditions affecting color
US6045502A (en) * 1996-01-17 2000-04-04 Spectrx, Inc. Analyzing system with disposable calibration device
US6178341B1 (en) * 1997-12-18 2001-01-23 Chromatics Color Sciences International, Inc. Color measurement system with color index for skin, teeth, hair and material substances
US6615061B1 (en) * 1998-11-23 2003-09-02 Abbott Laboratories Optical sensor having a selectable sampling distance for determination of analytes
US8154612B2 (en) * 2005-08-18 2012-04-10 Qualcomm Incorporated Systems, methods, and apparatus for image processing, for color classification, and for skin color detection
US20080288227A1 (en) * 2007-05-18 2008-11-20 University Of Michigan Algorithms to predict clinical response, adherence, and shunting with thiopuriness
US20110206254A1 (en) * 2010-02-22 2011-08-25 Canfield Scientific, Incorporated Reflectance imaging and analysis for evaluating tissue pigmentation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2967359A4 *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10028675B2 (en) 2012-05-10 2018-07-24 University Of Washington Through Its Center For Commercialization Sound-based spirometric devices, systems and methods
JP2015009140A (en) * 2013-06-28 2015-01-19 信旭 周 Jaundice measurement program and jaundice measurement system using the same
WO2016145877A1 (en) * 2015-03-19 2016-09-22 深圳贝申医疗技术有限公司 Colorimetric card for accurately determining newborn skin color
US10799150B2 (en) 2015-12-22 2020-10-13 Picterus As Image based bilirubin determination
CN108430326A (en) * 2015-12-22 2018-08-21 皮克特鲁斯公司 Bilirubin is determined based on image
WO2017111606A1 (en) * 2015-12-22 2017-06-29 Picterus As Image based bilirubin determination
CN105577982A (en) * 2015-12-31 2016-05-11 深圳市金立通信设备有限公司 Image processing method and terminal
EP3507774A4 (en) * 2016-08-30 2019-12-25 Konica Minolta Laboratory U.S.A., Inc. Method and system for capturing images for wound assessment with self color compensation
CN114269231A (en) * 2019-06-18 2022-04-01 数字诊断公司 Determining a diagnosis based on a patient's skin tone using a set of machine-learned diagnostic models
WO2020257108A1 (en) * 2019-06-18 2020-12-24 3Derm Systems, Inc. Using a set of machine learning diagnostic models to determine a diagnosis based on a skin tone of a patient
CN110796642A (en) * 2019-10-09 2020-02-14 陈浩能 Method for determining fruit quality degree and related product
CN112401873A (en) * 2020-11-18 2021-02-26 南京信息职业技术学院 Calibration color plate of neonatal jaundice tester, preparation method and calibration method
WO2022132882A1 (en) * 2020-12-17 2022-06-23 Colgate-Palmolive Company System and device for measuring a color value, and methods thereof
EP4282330A1 (en) * 2022-05-25 2023-11-29 AI Labs Group, S.L. Ai marker device, method for standardising an image using the ai marker device and method for grading the severity of a skin disease using both
WO2023227286A1 (en) * 2022-05-25 2023-11-30 Ai Labs Group, S.L. Calibrating a digital image of skin tissue
EP4292515A1 (en) * 2022-06-16 2023-12-20 Koninklijke Philips N.V. Non-contact subject monitoring
WO2023241986A1 (en) * 2022-06-16 2023-12-21 Koninklijke Philips N.V. Non-contact subject monitoring
CN117379033A (en) * 2023-12-13 2024-01-12 深圳市宗匠科技有限公司 Skin pigment detection method, device, computer equipment and storage medium
CN117379033B (en) * 2023-12-13 2024-02-20 深圳市宗匠科技有限公司 Skin pigment detection method, device, computer equipment and storage medium

Also Published As

Publication number Publication date
JP6545658B2 (en) 2019-07-17
EP2967359A1 (en) 2016-01-20
KR102237583B1 (en) 2021-04-07
JP2016516475A (en) 2016-06-09
EP2967359A4 (en) 2017-01-11
KR20150128916A (en) 2015-11-18

Similar Documents

Publication Publication Date Title
US10285624B2 (en) Systems, devices, and methods for estimating bilirubin levels
KR102237583B1 (en) Estimating bilirubin levels
Dimauro et al. A new method and a non-invasive device to estimate anemia based on digital images of the conjunctiva
US11382558B2 (en) Skin feature imaging system
US20190133514A1 (en) System and method for optical detection of skin disease
US9773320B2 (en) Method for estimating a quantity of a blood component in a fluid canister
US10945637B2 (en) Image based jaundice diagnosing method and apparatus and image based jaundice diagnosis assisting apparatus
CN104880412B (en) Freshness information output method and freshness information output device
US10682089B2 (en) Information processing apparatus, information processing method, and program
US20160338603A1 (en) Signal processing device, signal processing method, and computer-readable recording medium
EP3466324A1 (en) Skin diagnostic device and skin diagnostic method
JP6356141B2 (en) Medical device or system for measuring hemoglobin levels during an accident using a camera-projector system
KR20130024065A (en) Apparatus and method for detecting complexion, apparatus and method for determinig health using complexion, apparatus and method for generating health sort function
JP6266948B2 (en) Jaundice measurement system
US10726282B2 (en) Biometric authentication apparatus, biometric authentication system and biometric authentication method
KR101957773B1 (en) Evaluation method for skin condition using image and evaluation apparatus for skin condition using image
EP3023936B1 (en) Diagnostic apparatus and image processing method in the same apparatus
JP2021148793A (en) Method for determining characteristics of sample container in in-vitro diagnostics system, analyzing device, and in-vitro diagnostics system
EP3066976A1 (en) Organ image capturing device
US20230386660A1 (en) System and method for detecting gastrointestinal disorders
Ahmed et al. Automatic Region of Interest Extraction from Finger Nail Images for Measuring Blood Hemoglobin Level
Outlaw et al. Smartphone screening for neonatal jaundice via ambient-subtracted sclera chromaticity: neoSCB app pilot study
WO2023275774A1 (en) Non-invasive device for quickly estimating anaemia
CN114270399A (en) Information processing device, pulse wave measurement system, and pulse wave measurement program
JP2021171365A (en) Method and device for measuring moisture content of lip

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14784645

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016501634

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2014784645

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20157028278

Country of ref document: KR

Kind code of ref document: A