US20100266165A1 - Methods and systems for biometric identification - Google Patents


Info

Publication number
US20100266165A1
US20100266165A1 (application US12/770,118)
Authority
US
United States
Prior art keywords
iris
template
image
generating
templates
Prior art date
Legal status
Abandoned
Application number
US12/770,118
Inventor
James R. Matey
James R. Bergen
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/770,118
Publication of US20100266165A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/197 Matching; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/28 Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/772 Determining representative reference patterns, e.g. averaging or distorting patterns; Generating dictionaries

Definitions

  • Embodiments of the present invention generally relate to biometric identification systems and in particular to biometric identification systems using iris recognition methods.
  • the basic principles of iris recognition are summarized in FIG. 1 .
  • the subject iris 10 is illuminated with light from controlled and ambient light sources 12 .
  • the camera 14 and the controlled illumination 16 are at some defined standoff distance 18 , 20 , from the subject.
  • the camera and lens 14 (possibly filtered) acquires an image that is then captured by a computer 22 .
  • the iris image 24 is then segmented, normalized and an iris template (commonly called an iris code) 28 is generated. Segmentation identifies the boundaries of the pupil and iris. Normalization remaps the iris region between the pupil and the sclera (the white of the eye) into a form convenient for template generation and removes the effects of pupil dilation using a suitable model.
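The segmentation and normalization steps can be sketched in Python. This is an illustrative sketch only, not the patented method: it assumes circular, concentric pupil and iris boundaries and nearest-neighbor sampling, and the function name and parameters (`normalize_iris`, 64 radii, 256 angles) are hypothetical.

```python
import numpy as np

def normalize_iris(image, cx, cy, r_pupil, r_iris, n_radii=64, n_angles=256):
    """Remap the annulus between the pupil and iris boundaries onto a
    rectangular (radius x angle) grid, Daugman-style rubber sheet.
    Assumes concentric circular boundaries for simplicity."""
    out = np.zeros((n_radii, n_angles), dtype=image.dtype)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for i in range(n_radii):
        # fractional radial position between pupil edge and iris edge
        r = r_pupil + (r_iris - r_pupil) * (i + 0.5) / n_radii
        xs = np.clip((cx + r * np.cos(thetas)).astype(int), 0, image.shape[1] - 1)
        ys = np.clip((cy + r * np.sin(thetas)).astype(int), 0, image.shape[0] - 1)
        out[i, :] = image[ys, xs]  # nearest-neighbor sample along the ring
    return out
```

The output is a rectangular image with radius increasing from top to bottom and angle from left to right, matching the normalized images described for FIGS. 2B and 3C.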
  • the template is then matched against an existing database 30 of previously enrolled irises; a match against the database indicates the current iris is the same iris used to create the template in the database.
  • iris recognition techniques suffer from several difficulties for some applications.
  • One difficulty is that when used in systems with large (e.g., 500,000) people in the database, such systems require a large number of servers to accommodate the large number of people in the databases.
  • iris recognition algorithms recommend a resolution of the order of 200 pixels across the iris.
  • acquisition of iris images of sufficient quality for iris recognition is challenging, particularly from a distance.
  • Current commercially available iris cameras require substantial cooperation on the part of the subject.
  • Two simple metrics for the required degree of cooperation are the capture volume and the residence time in the capture volume.
  • the capture volume is the three dimensional volume into which an eye must be placed and held for some period of time (the residence time) in order for the system to reliably capture an iris image of sufficient quality for iris recognition.
  • For ease of use, the spatial extent of the capture volume should be as large as possible and the residence time as small as possible.
  • a related issue is the standoff distance, the distance between the subject and the iris image acquisition system.
  • Existing systems require reasonably close proximity, in some cases an ‘in your face’ proximity.
  • Existing iris recognition algorithms are generally good enough for most applications in which the subject is sufficiently cooperative.
  • a challenge resides in reducing constraints on the subject so iris recognition is easier to use.
  • Embodiments of the present invention address these and other needs and generally relate to methods for improving the acquisition of iris images, improving methods of generating iris templates and for improving methods for searching template databases.
  • Embodiments of the present invention generally relate to methods for improving performance of an iris recognition system.
  • the methods can include using iris image registration and averaging steps to improve performance.
  • the iris image is averaged.
  • the segmented iris image is averaged.
  • the normalized iris image is averaged.
  • the biometric template is averaged.
  • the term ‘averaged’ denotes a combining of two or more entities, such as iris templates, based on a set of rules that use information that can be contained both within and external to the entity, or iris template.
  • the term average, as used herein, is not restricted to the narrower concept of an arithmetic mean. The purpose of the averaging is to create an entity which is better than any one of the individual entities that have been averaged.
  • Embodiments of the present invention include methods for iris recognition in which false match rates are balanced against other performance parameters.
  • an iris recognition method for capturing iris images that provide increased capture volume, decreased acquisition time, increased standoff and the capability of acquisition of iris images from moving subjects.
  • Embodiments of a method for identifying an iris image can include obtaining an iris image of an eye, segmenting the iris image, generating, from the segmented iris image, a normalized iris image, and generating, from the normalized iris image, an iris template.
  • the method can also include generating a modified iris template by extracting a portion of the iris template, comparing the modified iris template with a plurality of previously stored other modified iris templates and matching the modified iris template with one of the plurality of previously stored other modified iris templates.
  • FIG. 1 is a schematic of an iris recognition system
  • FIG. 2A depicts an iris image with segmentation indicated
  • FIG. 2B depicts an iris image that has been normalized.
  • FIG. 3A depicts an original iris image
  • FIG. 3B depicts the iris image of FIG. 3A with segmentation indicated
  • FIG. 3C depicts a normalized image of the iris image of FIG. 3B .
  • FIG. 4 is a graphical display of two iris codes from the same eye and the difference between them.
  • a captured iris image is then segmented, normalized and an iris template (commonly called an iris code) is generated.
  • the template is then matched against an existing database of previously enrolled irises; a match against the database indicates the current iris is the same iris used to create the template that is in the database.
  • Segmentation identifies the boundaries of the pupil and iris. Normalization remaps the iris region between the pupil and the sclera (the white of the eye) using a transform that removes the pupil size variation.
  • FIGS. 2A and 2B show an example of a polar transformation described in U.S. Pat. No. 5,291,560 (issued to Daugman), as is known to those of skill in the art.
  • FIG. 2A depicts an iris image with segmentation 200 indicated.
  • FIG. 2B depicts a normalized image.
  • the normalized image has increasing radius 240 from top to bottom and increasing angle from left to right, as is known to those of skill in the art.
  • the horizontal streaks on the segmented image are an artifact of the specularity reduction algorithm used for this example.
  • An example of an iris code generated by performing a dot product between complex Gabor wavelets and the normalized image is herein described.
  • the phase angles of resulting complex dot products are then quantized to 2 bits and assembled into a bit array that comprises (with possibly additional information) an iris template.
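The 2-bit phase quantization described above can be sketched as follows, assuming a 1-D complex Gabor wavelet convolved along the angular direction of the normalized image; the kernel parameters and function names are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def gabor_kernel(size=8, freq=0.25, sigma=3.0):
    """A 1-D complex Gabor wavelet (illustrative parameters)."""
    x = np.arange(size) - size / 2.0
    return np.exp(-x**2 / (2 * sigma**2)) * np.exp(2j * np.pi * freq * x)

def iris_code(normalized, kernel=None):
    """Quantize the phase of complex Gabor responses to 2 bits per
    location: one bit for the sign of the real part, one for the sign
    of the imaginary part."""
    k = gabor_kernel() if kernel is None else kernel
    rows = []
    for row in normalized:
        # complex dot products of the wavelet with the angular signal
        resp = np.convolve(row.astype(float), k, mode='same')
        rows.append(np.stack([(resp.real >= 0), (resp.imag >= 0)], axis=-1))
    return np.array(rows, dtype=np.uint8)  # shape (n_rows, n_cols, 2)
```

Each location yields two bits, the signs of the real and imaginary parts of the complex response, which together quantize the phase angle into four quadrants.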
  • iris template 400 contains a bit pattern of light and dark bits 402 , 404 that indicate the presence and location of iris contours.
  • the mask 410 also contains a light and dark bit pattern 411 , 412 indicating locations in the template 400 of relative high and low confidence of accuracy, as is known to those of skill in the art.
  • the comparison step computes the Hamming distance between the bit array of one template and that of another and compares that distance to a pre-determined threshold.
  • the Hamming distance is the number of bits that differ between the two bit arrays.
  • the fractional Hamming distance is the fraction of the bits that differ between the two templates.
  • The term Hamming distance is conventionally used in this context to denote the fractional Hamming distance, and this convention is adopted for the purposes of this description. It has been shown that Hamming distances less than 0.33 are statistically unlikely for templates that arise from different irises.
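The masked fractional Hamming distance, with a barrel shift over the angular axis to compensate for head rotation, can be sketched as follows (a sketch with illustrative names, not the commercial implementation):

```python
import numpy as np

def fractional_hamming(code_a, mask_a, code_b, mask_b):
    """Fraction of differing bits over positions where both masks mark
    the bits as valid."""
    valid = mask_a & mask_b
    n = int(valid.sum())
    if n == 0:
        return 1.0  # no comparable bits: treat as maximal distance
    diff = (code_a ^ code_b) & valid
    return int(diff.sum()) / n

def best_shift_distance(code_a, mask_a, code_b, mask_b, max_shift=8):
    """Minimum fractional Hamming distance over barrel shifts of the
    angular (column) axis, compensating for rotation of the eye."""
    return min(
        fractional_hamming(code_a, mask_a,
                           np.roll(code_b, s, axis=1), np.roll(mask_b, s, axis=1))
        for s in range(-max_shift, max_shift + 1)
    )
```

A match is declared when the minimum distance falls below a pre-determined threshold such as the 0.33 figure noted above.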
  • alternative methods for iris template generation and comparison, and methods for modification of conventional iris templates, can enable improvements in the performance of iris recognition systems.
  • the alternative and modified templates and comparison methods can be useful in the construction of reference databases and in preparation of templates to be compared with the reference database.
  • a method for using a lower reliability method to search the iris template database for likely matches that can be subsequently confirmed with a higher reliability method such as, for example, a method practiced by Daugman. If the lower reliability method generates false matches at a rate P_FM, the computational cost of the proposed method is approximately O_L + P_FM·O_H per comparison, where:
  • O_H is the computational cost for a single comparison of the high reliability method, and
  • O_L is the computational cost for a single comparison of the low reliability method.
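Assuming the two-stage scheme costs roughly O_L + P_FM·O_H per comparison (every database entry pays the cheap low-reliability test, and a fraction P_FM additionally pays the expensive high-reliability confirmation), the arithmetic can be expressed as:

```python
def two_stage_cost(o_high, o_low, p_false_match):
    """Expected cost of one database comparison for the two-stage
    scheme: every entry pays the cheap low-reliability test, and a
    fraction p_false_match additionally pays the expensive
    high-reliability confirmation."""
    return o_low + p_false_match * o_high

def speedup(o_high, o_low, p_false_match):
    """Speedup relative to running the high-reliability method alone."""
    return o_high / two_stage_cost(o_high, o_low, p_false_match)
```

With O_L negligible and P_FM = 0.125, this gives the factor-of-eight maximum improvement discussed below.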
  • lower reliability methods involve extraction of a Modified Iris Template that contains a subset of the information contained in the original iris template. Matching this modified iris template against a correspondingly modified stored iris template will yield a partial measure of similarity that can provide lower reliability identity information.
  • Exemplary embodiments of such partial template methods that could be used with templates of the form used in Daugman-like methods include, but are not limited to: 1) row extraction and 2) column extraction.
  • Another embodiment of lower reliability match methods involves an indexing process. In this embodiment an indexible iris template with reduced degrees of freedom is extracted using error correcting code techniques.
  • Another embodiment of lower reliability match methods involves binning iris templates based on a similarity measure or characteristic metric that allows grouping templates of similar structure (i.e., binning methods).
  • a new template consisting of the data from a single row of the conventional template is constructed. If the template has N rows, comparison of such modified templates will be less computationally expensive by a factor of 1/N.
  • the rows could be combined into a single row by a generalized averaging technique, for example, a plurality voting or super-majority voting technique.
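Row extraction and the generalized (voting) combination of rows can be sketched as follows; the super-majority threshold and function names are illustrative assumptions.

```python
import numpy as np

def extract_row(template, row_index=0):
    """Modified iris template consisting of a single row of the
    conventional template; comparing such templates costs roughly 1/N
    of a full comparison for an N-row template."""
    return template[row_index:row_index + 1, :]

def collapse_rows_by_vote(template, mask, supermajority=0.75):
    """Combine all rows into one row by voting down each column.  A bit
    is emitted (and marked valid) only when the valid bits in that
    column reach the requested super-majority for one state."""
    out_code = np.zeros(template.shape[1], dtype=np.uint8)
    out_mask = np.zeros(template.shape[1], dtype=np.uint8)
    for j in range(template.shape[1]):
        valid = mask[:, j].astype(bool)
        n = int(valid.sum())
        if n == 0:
            continue  # no valid bits in this column
        ones = int(template[valid, j].sum())
        if ones / n >= supermajority:
            out_code[j], out_mask[j] = 1, 1
        elif (n - ones) / n >= supermajority:
            out_code[j], out_mask[j] = 0, 1
        # otherwise leave the mask cleared: the vote was not decisive
    return out_code, out_mask
```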
  • the probability of a 2.2σ or larger deviation from the mean is 1.5%.
  • the Gaussian approximation used here has larger tails than the binomial; this inflates the P_FM estimate and thereby reduces the estimated improvement. According to this particular approach, the maximum improvement is a factor of eight. For example, a backend server of an iris recognition system requiring $1M worth of hardware using only the conventional high reliability method may see that cost reduced to the order of $125K using the method described above.
  • In this embodiment there can be provided a step to decimate columns, keeping a constant angular distance between the remaining columns so that one can perform the barrel shift described above.
  • This embodiment of the method is less advantageous than the row extraction because it decreases the angular resolution of the barrel shift and does not make use of the correlation between code bits along the columns.
  • a typical iris template has of the order of 250 degrees of freedom.
  • the iris template generated during recognition will differ from the enrollment template. However, if the error correction works, one can still recover the template index.
  • This embodiment may use the recovered index to pick the corresponding iris template from the database. With a sorted lookup table this is simply several memory-access and compare operations. In the next step of this embodiment, the full iris templates are compared. When the index works, one saves a factor of approximately N (the database size). If the index fails, the method falls back to any of the methods already described. If P_IF is the probability that the index fails, the speedup of the index method is a factor of 1/[(1/N)+P_IF] over that of the fallback method alone.
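The index-with-fallback flow and its speedup factor, 1/[(1/N)+P_IF], can be sketched as follows (the lookup-table shape and function names are assumptions):

```python
def indexed_lookup(probe_index, index_table, probe_template, database,
                   match_fn, fallback_fn):
    """Try the recovered index first; fall back to a full search when
    the index misses or the full-template comparison rejects it."""
    row = index_table.get(probe_index)            # several memory accesses
    if row is not None and match_fn(probe_template, database[row]):
        return row
    return fallback_fn(probe_template, database)  # index failed

def index_speedup(n_database, p_index_fail):
    """Speedup of the index method over the fallback alone:
    1 / [(1/N) + P_IF]."""
    return 1.0 / (1.0 / n_database + p_index_fail)
```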
  • the binning method generates a characteristic metric from the enrollment iris templates that can be used to partition the database of iris codes into R subsets that can be subsequently selected on the basis of the computation of the characteristic metric for an identification template.
  • one step could compute the energy spectrum of the enrollment iris templates and select the L largest peaks in the spectrum.
  • Another step could assemble the peak locations into L tables that each share an index with the table of full iris template and sort the L tables.
  • this embodiment may include a step of comparing an identification template against the database, which allows the method to compute its peak locations and do full Hamming distance comparisons of the identification template against all the enrollment templates that have a peak in one of the L tables that is within some delta of the peaks of the identification template.
  • the size of the delta and the range of the peaks location in this embodiment will define R—though the deltas could be non-uniform.
  • this embodiment of the method saves a factor of approximately R/L. If the binning fails, the steps could fall back to any of the methods already described.
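A minimal sketch of the binning idea, assuming the characteristic metric is the locations of the L largest peaks in the energy spectrum of a template row; the FFT-based spectrum and the candidate-selection rule shown here are illustrative assumptions.

```python
import numpy as np

def peak_locations(template_row, n_peaks=3):
    """Locations of the L largest peaks in the energy spectrum of a
    template row, used as a characteristic metric for binning."""
    spectrum = np.abs(np.fft.rfft(template_row.astype(float))) ** 2
    spectrum[0] = 0.0  # ignore the DC term
    return sorted(np.argsort(spectrum)[-n_peaks:].tolist())

def candidate_bins(probe_peaks, enrolled_peak_table, delta=1):
    """Indices of enrolled templates sharing a peak within +/-delta of
    any probe peak; only these receive full Hamming comparisons."""
    hits = set()
    for idx, peaks in enumerate(enrolled_peak_table):
        if any(abs(p - q) <= delta for p in probe_peaks for q in peaks):
            hits.add(idx)
    return hits
```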
  • In implementing one of these reduced size template approaches, in one embodiment, the system generates the templates for the proposed search at substantially the same time as it generates the full templates, and links them to the full templates in a database.
  • Because the reduced size templates take up less memory, the reduction in memory use can enable maintenance of the entire database in memory, reducing the need for disk paging; this is a speed improvement over and above the simple analysis above.
  • Embodiments of the present invention include methods for improving capture of iris images to provide increased capture volume, decreased acquisition time, increased standoff and the capability of acquisition of iris images from moving subjects.
  • multiple images are combined to generate a template that is improved as compared to a template obtained from any of the individual images.
  • this can be achieved by generalized averaging of the one or more of following items which are generated at various stages of the template generation process: 1) the iris image; 2) the segmented iris image; 3) the normalized iris image; and 4) the biometric template.
  • information from the later stages of the template generation process is used to guide averaging at the earlier stages of the process.
  • the stages are illustrated in FIGS. 3A-C .
  • the mask bits in the template indicate those regions of the template—and the corresponding locations in the images—where the image was of sufficient quality to get good code bits.
  • FIG. 3A shows an original iris image 300 .
  • FIG. 3B is an iris image with segmentation indicated 310 .
  • the horizontal streaks on the segmented image are an artifact of the specularity reduction algorithm used for this example.
  • FIG. 3C is a normalized image 320 .
  • the normalized image has increasing radius from top to bottom and increasing angle from left to right
  • a typical template includes 2048 code bits and an equal number of mask bits. As such, for good quality iris images approximately 1500 of the mask bits are set; hence, there are approximately 500 bits in the code sections that are flagged as not valid.
  • FIG. 4 illustrates a comparison 450 of the two codes.
  • At the barrel shift that minimizes the Hamming distance, there are 374 bits ( 452 ) in the difference between the masks.
  • If the mask bits are an accurate representation of the quality of the code bits, one can improve the number of valid bits in a new template for this eye by performing the following steps: 1) barrel shifting one of the templates to align with the other, using Hamming distance as the criterion for best shift; 2) creating a new blank template with all mask bits and code bits turned off; and 3) for each location in the pair of aligned templates, checking the mask bits.
  • If both mask bits are set, the code bits are checked. If they agree, the corresponding code bit of the new template may be set to that value and the corresponding mask bit of the new template may be set. If the code bits do not agree, the corresponding mask bit of the new template may be cleared, since the value of the code bit is not known.
  • If only one mask bit is set, the corresponding code bit from the template that has the mask bit set is selected, and the corresponding bit of the new template is set to the value of the code bit in the template with the valid mask bit.
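The two-template combination rules above can be sketched with bitwise operations (an illustrative sketch; arrays hold one bit per element, and the templates are assumed already aligned by barrel shifting):

```python
import numpy as np

def merge_templates(code_a, mask_a, code_b, mask_b):
    """Combine two aligned templates: agreeing valid bits are kept,
    disagreeing valid bits are masked out, and a bit valid in only one
    template is taken from that template."""
    both = mask_a & mask_b          # valid in both templates
    only_a = mask_a & ~mask_b       # valid only in A
    only_b = mask_b & ~mask_a       # valid only in B
    agree = both & ~(code_a ^ code_b)  # both valid and code bits match
    new_mask = agree | only_a | only_b
    new_code = (agree & code_a) | (only_a & code_a) | (only_b & code_b)
    return new_code, new_mask
```

The same rules extend naturally to the multi-template vote described below by tallying the valid code bits at each location.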
  • the system can 1.) align all of templates; and 2.) for each bit location, take a vote among all of the templates for which the mask bit of that location is set, and set the corresponding bit of the new template to the value determined by the outcome of the vote. Further, the mask bit of the new template is set.
  • the vote criteria can be varied depending on applications. For some applications a simple plurality may be sufficient, for others a super-majority or some other more complex vote criteria may be optimal. If the vote is not decisive, for example the voting does not provide a required super majority for either bit state, the mask bit would be cleared to indicate that that bit is not reliable.
  • If no template has the mask bit set at a location, the mask and code bits of the new template should be cleared.
  • the templates are aligned to the template with the most valid bits.
  • An optimization of the alignment procedures is also contemplated by the present invention. One possible optimization is to average the two best templates using the two template method described above, then to incrementally add in the other templates, one by one to the averaged template.
  • One embodiment of the present invention relating to the averaging of initial iris images includes proper alignment and scaling.
  • the centers and radii of the pupil and iris may be computed.
  • information about the relative rotation of the iris is generated. In another embodiment having two images, this information can be used as follows:
  • the user would want to pick the images with the most mask bits set as the alignment image.
  • the averaged image can then be used to generate an enhanced template.
  • Enhanced iris templates generated using this technique can be used in place of the standard iris templates, known in the art, both as templates to be identified and templates to be stored in a reference database.
  • Analysis of an annular object such as a human iris can be facilitated by transformation of the image in rectangular co-ordinates to an image in polar coordinates as shown in FIGS. 2A and 2B .
  • a method for transforming images of the eye taken at near normal incidence using a rectangular to polar transformation is provided.
  • a transformation into an alternative co-ordinate system may be used.
  • the present invention contemplates biometric recognition for all beings having irises. Tracking the identity of livestock, zoo animals, pets and racing animals are potential applications of iris recognition. Some non-human species have eyes with irises or pupils that are non-circular. In these cases, a transformation into an alternative co-ordinate system may be preferable.
  • the pupil of a cat is not circularly symmetric.
  • the cat iris is more similar to a slit than a circular hole.
  • an ellipse is a better model for the shape of the pupil than is a circle and a form of elliptical coordinates as described in “ Mathematical Methods for Physicists ,” by G. Arfken, (Academic Press, NY), may be a better coordinate system.
  • other classic coordinate systems described in Arfken and the references cited therein may be useful. More complicated non-classic coordinate systems may also be useful.
  • a physiological/mechanical model of pupil dilation may be used to construct a specialized coordinate system in which the pupil dilation can be easily normalized.
  • One embodiment includes a method for transformation from the rectangular co-ordinate system of the image to another rectangular co-ordinate system that takes into account the projection and a subsequent transformation to a coordinate system appropriate for the eye structure. After the coordinate transformation(s), a further step may include applying other transformations to the image.
  • the method may include an evaluation of the original image or the normalized image to determine parts of the image not suitable for analysis for reasons such as specularities or occlusion.
  • Normalized images can be directly compared by correlation analysis. Given two normalized images, A and B, subtract off their respective means and then compute the normalized correlation C = sum(A'_i · B'_i) / sqrt( sum(A'_i²) · sum(B'_i²) ), where A' and B' are the mean-subtracted images and the sums run over the included pixels.
  • Pixels might be excluded because they are part of a specularity in the iris or an occlusion of the iris.
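The direct correlation of two normalized images, restricted to pixels not excluded for specularity or occlusion, can be sketched as follows (a sketch of standard normalized correlation; names are illustrative):

```python
import numpy as np

def direct_correlation(a, b, valid):
    """Normalized correlation of two normalized iris images over the
    pixels marked valid (specularities and occlusions excluded)."""
    av = a[valid] - a[valid].mean()   # mean-subtracted included pixels
    bv = b[valid] - b[valid].mean()
    denom = np.sqrt((av * av).sum() * (bv * bv).sum())
    return float((av * bv).sum() / denom) if denom > 0 else 0.0
```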
  • the direct correlation method is presented first because it is a prototype for the approaches that follow.
  • the direct correlation method there is a comparison of images rather than some more compact representation of the images.
  • Compact representations have advantages for speed of comparison and size of storage for the templates.
  • the more compact representations are divided into phase like, amplitude like and other, with sub divisions of local and global.
  • the B image in the direct correlation example is replaced with two images; one has sine like character, the other has cosine like character.
  • Csine and Ccosine are computed as a function of barrel shift (excluding unsuitable pixels as discussed above). For each barrel shift, an angle θ is computed from the four quadrant arc-cosine: acos(Csine, Ccosine).
  • A function θ(barrel shift) is generated that is periodic in barrel shift, is representative of the image A, and can be compared with the θ's extracted from other normalized iris images to determine similarity of the normalized iris images.
  • Exemplary comparison methods suitable for use with the present invention include, but are not limited to: 1) normalized correlation of the θ's; and 2) quantization of the θ's and comparison of the resulting bit streams via Hamming distance.
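The global phase-like measure above can be sketched as follows. This Python sketch is illustrative, not from the disclosure: the single-frequency sine-like and cosine-like probe images, and the use of `np.arctan2` as the four-quadrant angle of the (Ccosine, Csine) pair, are assumptions.

```python
import numpy as np

def phase_signature(a, max_shift=None):
    """theta(barrel shift): correlate image A against a sine-like and a
    cosine-like probe image at each barrel shift of the angular axis,
    then take the four-quadrant angle of the (Ccosine, Csine) pair."""
    n_rows, n_cols = a.shape
    angles = 2 * np.pi * np.arange(n_cols) / n_cols
    sine_img = np.tile(np.sin(angles), (n_rows, 1))    # sine-like probe
    cosine_img = np.tile(np.cos(angles), (n_rows, 1))  # cosine-like probe
    shifts = range(n_cols if max_shift is None else max_shift)
    theta = []
    for s in shifts:
        shifted = np.roll(a, s, axis=1)
        c_sin = float((shifted * sine_img).sum())
        c_cos = float((shifted * cosine_img).sum())
        theta.append(np.arctan2(c_sin, c_cos))  # four-quadrant angle
    return np.array(theta)
```

The resulting θ array is periodic in barrel shift and can be compared across images by normalized correlation or by quantization and Hamming distance, as listed above.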
  • the B image is replaced in the direct correlation example with a collection of 2N images; composed of image pairs one has sine like character the other has cosine like character.
  • Each of the N pairs is designed to interrogate a specified region of the A image.
  • Each of the specified regions may be an arbitrary region or arbitrary collection of arbitrary sub-regions.
  • Csine(i) and Ccosine(i) are computed for each of the N pairs; i is the location index (excluding unsuitable pixels as discussed above). For each pair, an angle θ_i is computed from the four quadrant arc-cosine: acos(Csine(i), Ccosine(i)).
  • a function θ(i) is calculated that is representative of the image A, and which can be compared with θ(i)'s extracted from other normalized iris images to determine similarity of the normalized iris images.
  • the number of pixels included for each “i” may be tracked, and locations that do not have sufficient support or fail some other quality criterion are excluded.
  • Comparison may be more complicated than in the global measure case depending on the symmetry of the N pairs. If the N pairs are generated by successive barrel shifts of the first pair, the following exemplary processes may be used in accordance with the present invention: 1) normalized correlation of the θ's; and 2) quantization of the θ's and comparison of the resulting bit streams via Hamming distance with barrel shifts of the bit streams.
  • regions for the first pair of a set of N barrel shifted pairs which may be used in accordance with the present invention include, but are not limited to: 1) a single vertical (radial) region; 2) a group of isolated regions arranged vertically (radially); 3) a single diagonal (spiral) region; and 4) a group of isolated regions arranged diagonally (spirally).
  • the sine-like and cosine-like image pairs can be replaced in the analyses above with single images, processing the iris images in the same way except using the correlation, C, rather than acos(Csine, Ccosine).
  • these techniques can be combined in a variety of ways including, but not limited to: 1) using an amplitude or phase like local or global measure that is tuned for extraction of the barrel shift needed to align two normalized images to align a pair of normalized images and then perform a direct correlation analysis of the images; 2) using a low resolution, high false match rate local or global, amplitude or phase like measure to select candidates for direct correlation analysis of the normalized images; and/or 3) using a low resolution, high false match rate local or global, amplitude or phase like measure to select candidates and define the barrel shift for angular registration of the images. Amplitude or phase like local measures can then be computed at a collection of sites on any arbitrary matrix imposed on the registered normalized images.
  • two normalized images can be aligned using FFT techniques—transform both images via FFT, take the product, normalize appropriately, transform back and then find the peak in the resulting cross correlation function.
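The FFT alignment described above can be sketched as follows (illustrative; assumes the images share dimensions and the misalignment is a pure angular shift):

```python
import numpy as np

def fft_align(a, b):
    """Estimate the barrel (column) shift aligning two normalized
    images: transform both via FFT, multiply one transform by the
    conjugate of the other, transform back, and locate the peak of the
    resulting cross-correlation."""
    a = a - a.mean()  # remove DC so the correlation peak dominates
    b = b - b.mean()
    cross = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b))
    peak = np.unravel_index(np.argmax(np.abs(cross)), cross.shape)
    return int(peak[1])  # angular shift of b relative to a
```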

Abstract

A method for identifying an iris image can include obtaining an iris image of an eye, segmenting the iris image, generating, from the segmented iris image, a normalized iris image, and generating, from the normalized iris image, an iris template. The method can also include generating a modified iris template by extracting a portion of the iris template, comparing the modified iris template with a plurality of previously stored other modified iris templates, and matching the modified iris template with one of the plurality of previously stored other modified iris templates.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation application of U.S. Non-Provisional Patent Application No. 11/510,197 filed on Aug. 25, 2006, which claims the benefit of U.S. Provisional Application No. 60/711,105, filed Aug. 25, 2005, U.S. Provisional Application No. 60/711,106, filed Aug. 25, 2005, and U.S. Provisional Application No. 60/711,107, filed Aug. 25, 2005. The entire disclosures of U.S. Non-Provisional Patent Application No. 11/510,197, and U.S. Provisional Application Serial Nos. 60/711,105, 60/711,106, and 60/711,107 are incorporated by reference herein.
  • FIELD OF THE INVENTION
  • Embodiments of the present invention generally relate to biometric identification systems and in particular to biometric identification systems using iris recognition methods.
  • BACKGROUND OF THE INVENTION
  • Iris recognition is one of the most powerful techniques for biometric identification ever developed. Commercial systems, like those based on an algorithm developed by Daugman, (see U.S. Pat. No. 5,291,560, the contents of which are hereby incorporated by reference herein) have been available since 1995 and have been used in a variety of practical applications.
  • The basic principles of iris recognition are summarized in FIG. 1. The subject iris 10 is illuminated with light from controlled and ambient light sources 12. The camera 14 and the controlled illumination 16 are at some defined standoff distance 18, 20, from the subject. The camera and lens 14 (possibly filtered) acquires an image that is then captured by a computer 22. The iris image 24 is then segmented, normalized and an iris template (commonly called an iris code) 28 is generated. Segmentation identifies the boundaries of the pupil and iris. Normalization remaps the iris region between the pupil and the sclera (the white of the eye) into a form convenient for template generation and removes the effects of pupil dilation using a suitable model. The template is then matched against an existing database 30 of previously enrolled irises; a match against the database indicates the current iris is the same iris used to create the template in the database.
  • However, prior iris recognition techniques suffer from several difficulties for some applications. One difficulty is that when used in systems with large (e.g., 500,000) people in the database, such systems require a large number of servers to accommodate the large number of people in the databases.
  • Another difficulty is that prior techniques have been designed and optimized for applications in which the false match rate is of paramount importance—neglecting applications in which other factors are more important than the false match rate or in which other engineering tradeoffs should be considered. In addition, these techniques expect a reasonably high quality image. There are applications where one would accept a higher false match rate in return for the ability to use lower quality images. Forensic applications are one example. Acquisition of images for iris recognition in less constrained environments is another example.
  • Acquisition of high quality iris images is difficult because the human iris is a small target (~1 cm diameter), with relatively low albedo (~0.15) in the near IR. Existing iris recognition algorithms recommend a resolution on the order of 200 pixels across the iris. Hence, acquisition of iris images of sufficient quality for iris recognition is challenging, particularly from a distance. Current commercially available iris cameras require substantial cooperation on the part of the subject. Two simple metrics for the required degree of cooperation are the capture volume and the residence time in the capture volume. The capture volume is the three dimensional volume into which an eye must be placed and held for some period of time (the residence time) in order for the system to reliably capture an iris image of sufficient quality for iris recognition. For ease of use, the spatial extent of the capture volume should be as large as possible and the residence time as small as possible.
  • A related issue is the standoff distance, the distance between the subject and the iris image acquisition system. Existing systems require reasonably close proximity, in some cases an ‘in your face’ proximity. Existing iris recognition algorithms are generally good enough for most applications in which the subject is sufficiently cooperative. A challenge resides in reducing constraints on the subject so iris recognition is easier to use.
  • In scenarios in which iris recognition needs to be deployed in less constrained environments, we can reasonably expect that the acquired iris images will be of lower quality than those in highly constrained environments. Hence, there is a need for algorithms that can work with lower quality images, even at the possible expense of higher false match and/or false non-match rates.
  • Current iris recognition algorithms search a template database by brute force. The algorithms used are very efficient, but they conduct the search by systematically testing against every template in the database until a match is found. For large databases that are subject to high interrogation rates this consumes many CPU cycles and requires deployment of large collections of computers to provide acceptable response rates. Thus, there is a need for a method that can improve the search rate over existing algorithms and one that will decrease the equipment costs for the database searches. This will become particularly important as high throughput iris recognition systems become widely deployed.
  • Thus, there are multiple reasons that we need improvements to existing iris recognition methods, including alternatives to the existing methods.
  • SUMMARY OF THE INVENTION
  • Embodiments of the present invention address these and other needs and generally relate to methods for improving the acquisition of iris images, improving methods of generating iris templates and for improving methods for searching template databases.
  • Embodiments of the present invention generally relate to methods for improving performance of an iris recognition system. The methods can include using iris image registration and averaging steps to improve performance. In one embodiment, the iris image is averaged. In another embodiment, the segmented iris image is averaged. In yet another embodiment, the normalized iris image is averaged. In another embodiment, the biometric template is averaged. As used herein, the term 'averaged' denotes a combining of two or more entities, such as iris templates, according to a set of rules that may draw on information contained both within and external to the entity, or iris template. The term average, as used herein, is not restricted to the narrower concept of an arithmetic mean. The purpose of the averaging is to create an entity which is better than any one of the individual entities that have been averaged.
  • Embodiments of the present invention include methods for iris recognition in which false match rates are balanced against other performance parameters.
  • In one embodiment, there is provided an iris recognition method for capturing iris images that provide increased capture volume, decreased acquisition time, increased standoff and the capability of acquisition of iris images from moving subjects.
  • Embodiments of a method for identifying an iris image can include obtaining an iris image of an eye, segmenting the iris image, generating, from the segmented iris image, a normalized iris image, and generating, from the normalized iris image, an iris template. The method can also include generating a modified iris template by extracting a portion of the iris template, comparing the modified iris template with a plurality of previously stored other modified iris templates and matching the modified iris template with one of the plurality of previously stored other modified iris templates.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the present invention will be more readily understood from the detailed description of exemplary embodiments presented below considered in conjunction with the attached drawings, of which:
  • FIG. 1 is a schematic of an iris recognition system;
  • FIG. 2A depicts an iris image with segmentation indicated;
  • FIG. 2B depicts an iris image that has been normalized.
  • FIG. 3A depicts an original iris image;
  • FIG. 3B depicts the iris image of FIG. 3A with segmentation indicated;
  • FIG. 3C depicts a normalized image of the iris image of FIG. 3B; and
  • FIG. 4 is a graphical display of two iris codes from the same eye and the difference between them.
  • It is to be understood that the attached drawings are for purposes of illustrating the concepts of the invention and may not be to scale.
  • DETAILED DESCRIPTION OF THE INVENTION
  • As described above with reference to FIG. 1, a captured iris image is then segmented, normalized and an iris template (commonly called an iris code) is generated. The template is then matched against an existing database of previously enrolled irises; a match against the database indicates the current iris is the same iris used to create the template that is in the database.
  • Segmentation identifies the boundaries of the pupil and iris. Normalization remaps the iris region between the pupil and the sclera (the white of the eye) using a transform that removes the pupil size variation. FIGS. 2A and 2B are an example of a polar transformation described in U.S. Pat. No. 5,291,560 (issued to Daugman), as is known to those of skill in the art.
  • FIG. 2A depicts an iris image with segmentation 200 indicated. FIG. 2B depicts a normalized image. For this example, the normalized image has increasing radius 240 from top to bottom and increasing angle from left to right, as is known to those of skill in the art. The horizontal streaks on the segmented image are an artifact of the specularity reduction algorithm used for this example.
  • An example of an iris code generated by performing a dot product between complex Gabor wavelets and the normalized image is herein described. The phase angles of resulting complex dot products are then quantized to 2 bits and assembled into a bit array that comprises (with possibly additional information) an iris template.
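The 2-bit quantization step described above can be sketched as follows. This is a minimal illustration assuming the complex Gabor dot products have already been computed (the filtering itself is omitted); the common convention of taking one bit from the sign of the real part and one from the sign of the imaginary part is used, and the function name `quantize_phase` is an illustrative choice.

```python
import numpy as np

def quantize_phase(responses: np.ndarray) -> np.ndarray:
    """Quantize complex filter responses to 2 bits each: the sign of the
    real part and the sign of the imaginary part (one bit pair per
    phase quadrant)."""
    real_bits = (responses.real >= 0).astype(np.uint8)
    imag_bits = (responses.imag >= 0).astype(np.uint8)
    # Interleave the two bit planes into a single flat bit array.
    return np.stack([real_bits, imag_bits], axis=-1).reshape(-1)
```

One complex response per (row, column) site of the normalized image thus contributes two bits to the assembled bit array.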
  • With reference to FIG. 4, iris template 400 contains a bit pattern of light and dark bits 402, 404 that indicate the presence and location of iris contours. The mask 410 also contains a light and dark bit pattern 411, 412 indicating locations in the template 400 of relative high and low confidence of accuracy, as is known to those of skill in the art.
  • The comparison step computes the Hamming distance between the bit array of one template and that of another and compares that distance to a pre-determined threshold. The Hamming distance is the number of bits that differ between the two bit arrays. The fractional Hamming distance is the fraction of the bits that differ between the two templates. The term "Hamming distance" is conventionally used in this context to denote the fractional Hamming distance, and this convention is adopted for the purposes of this description. It has been shown that Hamming distances less than 0.33 are statistically unlikely for templates that arise from different irises. Some matching algorithms adjust the raw Hamming distance based upon the number of bits available in the templates and the size of the enrolled database to produce a modified Hamming distance, as is known to those of skill in the art.
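The masked fractional Hamming distance described above can be sketched as follows; this is a minimal illustration assuming codes and masks are stored as 0/1 arrays, and the helper name `fractional_hamming` is hypothetical.

```python
import numpy as np

def fractional_hamming(code_a, mask_a, code_b, mask_b):
    """Fractional Hamming distance over the bits both masks flag as
    valid; unrelated templates average ~0.5 on this measure."""
    valid = mask_a & mask_b
    n = int(valid.sum())
    if n == 0:
        return 1.0  # no comparable bits; treated as a non-match (an assumption)
    disagreements = int(((code_a ^ code_b) & valid).sum())
    return disagreements / n
```

A full matcher would compute this at every barrel shift and keep the minimum before comparing to the threshold.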
  • In accordance with embodiments of the present invention, there are provided alternative methods for iris template generation, for template comparison and methods for modification of conventional iris templates that enable improvements in the performance of iris recognition systems. The alternative and modified templates and comparison methods can be useful in the construction of reference databases and in preparation of templates to be compared with the reference database.
  • In accordance with embodiments of the present invention, there is provided a method for using a lower reliability method to search the iris template database for likely matches that can be subsequently confirmed with a higher reliability method, such as, for example, a method practiced by Daugman. If the lower reliability method generates false matches at a rate P_FM, the computational cost of the proposed method is:

  • CC_new = P_FM N O_H + N O_L = N(P_FM O_H + O_L)
  • where O_H is the computational cost of a single comparison with the high reliability method and O_L is the computational cost of a single comparison with the low reliability method. The computational cost using only the high reliability method is simply:

  • CC_H = N O_H
  • Hence the ratio of the costs is expressed as:

  • CC_new/CC_H = (P_FM O_H + O_L)/O_H = P_FM + O_L/O_H
  • Clearly, if O_L << O_H and P_FM << 1, the computational cost of the new method can be much smaller than that of applying the high reliability method alone.
  • Several embodiments of lower reliability methods involve extraction of a Modified Iris Template that contains a subset of the information contained in the original iris template. Matching this modified iris template against a correspondingly modified stored iris template will yield a partial measure of similarity that can provide lower reliability identity information. Exemplary embodiments of such partial template methods that could be used with templates of the form used in Daugman-like methods include, but are not limited to: 1) row extraction and 2) column extraction. Another embodiment of lower reliability match methods involves an indexing process. In this embodiment an indexable iris template with reduced degrees of freedom is extracted using error correcting code techniques.
  • Another embodiment of lower reliability match methods involves binning iris templates based on a similarity measure or characteristic metric that allows grouping templates of similar structure (i.e., binning methods).
  • Row Extraction
  • According to an embodiment of the present invention, a new template consisting of the data from a single row of the conventional template is constructed. If the template has N rows, comparison of such modified templates will be less computationally expensive by a factor of 1/N. As an alternative to extracting a single row, the rows could be combined into a single row by a generalized averaging technique, for example, a plurality voting or super-majority voting technique.
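Both variants above, extracting a single row and collapsing all rows by a plurality vote, can be sketched as follows. This is an illustrative sketch (the function names are hypothetical), and the tie-breaking rule in the vote is an arbitrary choice.

```python
import numpy as np

def extract_row(template: np.ndarray, row: int) -> np.ndarray:
    """Modified iris template built from a single row of an N-row template."""
    return template[row].copy()

def combine_rows_by_vote(template: np.ndarray) -> np.ndarray:
    """Collapse all N rows into one row by a per-column plurality vote.
    Ties are resolved toward 1 here (an arbitrary choice)."""
    ones = template.sum(axis=0)
    return (ones * 2 >= template.shape[0]).astype(np.uint8)
```

Either result is 1/N the size of the original code portion, so comparisons against similarly reduced stored templates cost roughly 1/N as much.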
  • One can estimate the probability of obtaining a false match using the low reliability method (P_FM) in the following way. As suggested by Daugman, model the matching process using a binomial distribution with p = q = 0.5. It has been shown that the number of degrees of freedom in the iris is of the order of n = 250. Approximate the binomial distribution of normalized Hamming distances (fraction of bits set) by a Gaussian of mean 0.5 and standard deviation 0.5/√n. The nominal cutoff for recognition in existing systems is a Hamming distance of 0.3, which differs from the mean by 0.2. For n = 250, the standard deviation is ~0.032, so the difference is approximately 6 standard deviations. The probability of an event that is 6σ or larger is approximately 1:10^9. If one reduces the effective n by reducing the number of bits that are analyzed, one increases the probability of getting a match between two unrelated templates.
  • If the reduction is 8 fold, the standard deviation will be σ = 0.5/√(250/8), or about 0.09, and the cutoff is now 2.2σ from the mean. The probability of a 2.2σ or larger deviation from the mean is approximately 1.5%.
  • As such, the following expression may be derived:

  • CC_new = (0.125 + 0.015) CC_H = 0.14 CC_H
  • This is a factor of approximately 7 times faster than the conventional approach. According to an embodiment of the present invention, improved results are achieved by this method because the bits in the iris code are correlated along the vertical directions of the graphical display, corresponding to radial correlations. Hence, taking out a single row from a template of N rows captures more than 1/N of the degrees of freedom in the code.
  • The Gaussian approximation used here has larger tails than the binomial; this inflates the P_FM estimate and thereby reduces the estimated improvement. According to this particular approach, the maximum improvement is a factor of eight. For example, a backend server of an iris recognition system requiring $1M worth of hardware using only the conventional high reliability method may see its hardware cost reduced to on the order of $125K using the method described above.
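The estimate above can be reproduced numerically. This sketch assumes the Gaussian tail probability is evaluated with the complementary error function; the function name and defaults are illustrative.

```python
import math

def estimated_cost_ratio(n_dof=250, reduction=8, cutoff=0.30):
    """Estimated cost of the two-stage search relative to the high
    reliability method alone, per the Gaussian approximation above."""
    sigma = 0.5 / math.sqrt(n_dof / reduction)   # ~0.09 for an 8-fold reduction
    z = (0.5 - cutoff) / sigma                   # ~2.2 standard deviations
    p_fm = 0.5 * math.erfc(z / math.sqrt(2.0))   # Gaussian upper tail, ~1.3-1.5%
    return 1.0 / reduction + p_fm                # O_L/O_H = 1/reduction here
```

With the defaults this returns approximately 0.14, i.e., roughly a factor of 7 speedup over the conventional approach.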
  • Column Extraction
  • In this embodiment, there can be provided a step to decimate columns, keeping a constant angular distance between the remaining columns so one can do the barrel shift described above. This embodiment of the method is less advantageous than the row extraction because it decreases the angular resolution of the barrel shift and does not make use of the correlation between code bits along the columns.
  • Index Methods
  • A typical iris template has on the order of 250 degrees of freedom. One can use some fraction of those degrees of freedom to generate, at enrollment, an error corrected reduced iris template of fewer bits and store the error corrected value as an index that can be used as a hash table lookup into the database of iris templates. The iris template generated during recognition will differ from the enrollment template. However, if the error correction works, one can still recover the template index.
  • This embodiment may use the recovered index to pick the corresponding iris template from the database. With a sorted lookup table this is simply several memory access and compare operations. In the next step of this embodiment, the full iris templates are compared. When the index works, one saves a factor of approximately N (the database size). If the index fails, the method falls back to any of the methods already described. If P_IF is the probability that the index fails, the expected cost of the index method is approximately a factor of [(1/N) + P_IF] of that of the fallback method alone.
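The index-with-fallback search can be sketched as below. The error correction step itself is omitted (the query index is assumed to already be error corrected), and all names are illustrative.

```python
def search_with_index(query_index, query_template, index_table, templates, full_match):
    """Try the error-corrected index first; fall back to a linear scan.

    index_table maps an error-corrected index value to a position in
    the list of full templates; full_match is the high reliability
    comparison (returns True on a match)."""
    candidate = index_table.get(query_index)
    if candidate is not None and full_match(query_template, templates[candidate]):
        return candidate
    # Index failed (probability P_IF): fall back to the brute-force scan.
    for i, t in enumerate(templates):
        if full_match(query_template, t):
            return i
    return None
```

The expected cost is one full comparison when the index succeeds, plus the fallback scan with probability P_IF, matching the [(1/N) + P_IF] factor discussed above.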
  • Binning Methods
  • In accordance with an embodiment, the binning method generates a characteristic metric from the enrollment iris templates that can be used to partition the database of iris codes into R subsets that can be subsequently selected on the basis of the computation of the characteristic metric for an identification template. One can use any of a large variety of characteristics of the iris code.
  • As an example, in accordance with an embodiment of the present invention, one step could compute the energy spectrum of the enrollment iris templates and select the L largest peaks in the spectrum. Another step could assemble the peak locations into L tables that each share an index with the table of full iris templates, and sort the L tables.
  • In addition, this embodiment may include a step of comparing an identification template against the database: the method computes the identification template's peak locations and does full Hamming distance comparisons against all the enrollment templates that have a peak in one of the L tables within some delta of the peaks of the identification template. The size of the delta and the range of the peak locations in this embodiment will define R, though the deltas could be non-uniform.
  • When the binning works, this embodiment of the method saves a factor of approximately R/L. If the binning fails, the steps could fall back to any of the methods already described.
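The peak-based binning can be sketched as follows. Note the assumptions: the "energy spectrum" is read here as the FFT power spectrum of the flattened, mean-subtracted bit array, and the helper names are illustrative.

```python
import numpy as np

def spectrum_peaks(template: np.ndarray, num_peaks: int) -> np.ndarray:
    """Locations of the L largest peaks in the energy spectrum of a
    flattened iris template (a characteristic metric for binning)."""
    bits = template.reshape(-1).astype(float)
    energy = np.abs(np.fft.rfft(bits - bits.mean())) ** 2
    return np.sort(np.argsort(energy)[-num_peaks:])

def candidate_bins(query_peaks, enrolled_peaks, delta):
    """Indices of enrolled templates sharing a peak within +/- delta
    of one of the query's peaks; only these get full comparisons."""
    return [i for i, peaks in enumerate(enrolled_peaks)
            if any(abs(int(p) - int(q)) <= delta
                   for p in peaks for q in query_peaks)]
```

Enrollment computes and stores the peak tables; identification computes the query's peaks and restricts the full Hamming distance comparisons to the candidates returned here.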
  • In implementing one of these reduced size template approaches, in one embodiment, the system generates the templates for the proposed search at substantially the same time as it generates the full templates and links them to the full templates in a database. Because the reduced size templates take up less memory, the reduction in memory use can enable maintenance of the entire database in memory, reducing the need for disk paging, which is another speed improvement over and above the simple analysis above.
  • Embodiments of the present invention include methods for improving capture of iris images to provide increased capture volume, decreased acquisition time, increased standoff and the capability of acquisition of iris images from moving subjects. In one embodiment, multiple images are combined to generate a template that is improved as compared to a template obtained from any of the individual images. According to an embodiment of the present invention, this can be achieved by generalized averaging of the one or more of following items which are generated at various stages of the template generation process: 1) the iris image; 2) the segmented iris image; 3) the normalized iris image; and 4) the biometric template.
  • In addition, in one embodiment, information from the later stages of the template generation process is used to guide averaging at the earlier stages of the process. The stages are illustrated in FIGS. 3A-C. The mask bits in the template indicate those regions of the template—and the corresponding locations in the images—where the image was of sufficient quality to get good code bits.
  • FIG. 3A shows an original iris image 300. FIG. 3B is an iris image with segmentation indicated 310. The horizontal streaks on the segmented image are an artifact of the specularity reduction algorithm used for this example. FIG. 3C is a normalized image 320. For this example, the normalized image (FIG. 3C) has increasing radius from top to bottom and increasing angle from left to right.
  • Averaging Iris Templates
  • According to an embodiment of the present invention, methods are provided wherein two templates are compared by computing the Hamming distance between the code portions of the template as a function of barrel shift along the horizontal (angular) axis of the templates. Only the bits flagged as good in both masks are included in the comparison. The barrel shift takes into account angular rotation of the eye between the acquisitions of the images for the two templates. For example, a typical template includes 2048 code bits and an equal number of mask bits. As such, for good quality iris images approximately 1500 of the mask bits are set; hence, there are approximately 500 bits in the code sections that are flagged as not valid.
  • FIG. 4 illustrates a comparison 450 of the two codes. At the barrel shift that minimizes the Hamming distance, there are 374 bits (452) in the difference between the masks. Assuming the mask bits are an accurate representation of the quality of the code bits, one can improve the number of valid bits in a new template for this eye by performing the following steps: 1) barrel shifting one of the templates to align with the other, using Hamming distance as the criterion for best shift; 2) creating a new blank template with all mask bits and code bits turned off; and 3) for each bit location in the pair of aligned original templates, checking the mask bits as follows.
  • If both mask bits are set, then check the code bits. If they agree/match, the corresponding code bit of the new template may be set to that value and the corresponding mask bit of the new template may be set. If the code bits do not agree, the corresponding mask bit of the new template may be cleared, since the value of the code bit is not known.
  • If neither mask bit is set, no action is taken, and the mask and code bits of the new template are cleared because no information for this code bit is known.
  • If one mask bit is set, the corresponding code bit from the template that has the mask bit set is selected and the corresponding bit of the new template is set to the value of the code bit in the template with the valid mask bit.
  • In accordance with another embodiment, if there are more than two templates, the system can 1) align all of the templates; and 2) for each bit location, take a vote among all of the templates for which the mask bit of that location is set, set the corresponding bit of the new template to the value determined by the outcome of the vote, and set the mask bit of the new template. The vote criteria can be varied depending on the application. For some applications a simple plurality may be sufficient; for others a super-majority or some other more complex vote criterion may be optimal. If the vote is not decisive, for example if the voting does not provide a required super-majority for either bit state, the mask bit is cleared to indicate that that bit is not reliable.
  • If, for some bit location, none of the templates has a corresponding mask bit set, the mask and code bits of the new template should be cleared.
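The per-bit rules above, covering both the two-template case and the many-template vote, can be sketched as a single masked vote over pre-aligned templates stacked along the first axis. This is an illustrative sketch, not the patented implementation; the supermajority parameter generalizes the simple plurality (at 0.5, a tie clears the mask bit, and a lone valid template decides the bit, matching the two-template rules).

```python
import numpy as np

def merge_templates(codes, masks, supermajority=0.5):
    """Combine aligned templates bit-by-bit by voting among the
    templates whose mask flags each bit as valid.

    codes, masks: uint8 arrays of shape (num_templates, num_bits)."""
    votes_for_one = (codes & masks).sum(axis=0)
    votes_total = masks.sum(axis=0)
    frac = votes_for_one / np.maximum(votes_total, 1)
    set_one = (votes_total > 0) & (frac > supermajority)
    set_zero = (votes_total > 0) & (frac < 1.0 - supermajority)
    new_code = set_one.astype(np.uint8)
    new_mask = (set_one | set_zero).astype(np.uint8)
    return new_code, new_mask
```

Bits with no valid votes, or with an indecisive vote, end up with the mask bit cleared, exactly as described above.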
  • In accordance with an embodiment, the templates are aligned to the template with the most valid bits. An optimization of the alignment procedures is also contemplated by the present invention. One possible optimization is to average the two best templates using the two template method described above, then to incrementally add in the other templates, one by one to the averaged template.
  • Averaging Initial Iris Images
  • One embodiment of the present invention relating to the averaging of initial iris images includes proper alignment and scaling. In the course of generating the iris templates, the centers and radii of the pupil and iris may be computed. In the course of a comparison of iris templates, information about the relative rotation of the iris is generated. In an embodiment having two images, this information can be used as follows:
      • 1. From the radii computed for the two images, compute a scaling factor and scale the second image so its radii match those of the first;
      • 2. From the centers information of the two images, compute x and y offsets and shift the images in x and y to match the centers;
      • 3. From the barrel shift computed from the templates of the two images, compute a rotation and rotate the second image about the pupil center to align it with the first; and/or
      • 4. Average the aligned images.
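Steps 1-3 above can be sketched as the computation of the alignment parameters; the resampling, rotation, and averaging of the pixel data themselves are omitted. The dict layout and function name are assumptions for illustration, and the rotation is derived from the barrel shift found during template comparison.

```python
import math

def alignment_params(iris1, iris2, barrel_shift_bits, bits_per_row):
    """Scale, translation, and rotation that map image 2 onto image 1.

    Each iris argument is a dict with 'center' (x, y) and 'radius'.
    barrel_shift_bits is the shift (in template columns, out of
    bits_per_row) found when comparing the two templates."""
    scale = iris1["radius"] / iris2["radius"]
    # Translate the scaled center of image 2 onto the center of image 1.
    dx = iris1["center"][0] - iris2["center"][0] * scale
    dy = iris1["center"][1] - iris2["center"][1] * scale
    # One full row of template columns spans 2*pi of angle.
    rotation = 2.0 * math.pi * barrel_shift_bits / bits_per_row
    return scale, (dx, dy), rotation
```

Applying scale, then translation, then rotation about the pupil center to image 2 aligns it with image 1 for the averaging step.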
  • In an embodiment having more than two images, align them all to one image and average. In another embodiment, the image with the most mask bits set is picked as the alignment image. The averaged image can then be used to generate an enhanced template.
  • Averaging Segmented Iris Images
  • With access to the segmented iris images, in accordance with another embodiment, one can perform an initial alignment as described in the previous section and then employ correlation (or other) based image alignment techniques to refine the alignment between the two images—only utilizing portions of the image associated with the iris. The averaged image can then be used to generate an enhanced template.
  • Averaging Normalized Iris Images
  • With access to the normalized images, in accordance with another embodiment, one can use correlation (or other) based image alignment techniques to align the normalized images along the angular (horizontal) axis and then average them. One can instead use the barrel shift information from the template comparison to perform that alignment, or align the normalized images using the barrel shift information and then refine the alignment using correlation (or other) based image alignment techniques. Enhanced iris templates generated using this technique (including any of the exemplary embodiments) can be used in place of the standard iris templates known in the art, both as templates to be identified and as templates to be stored in a reference database.
  • The embodiment of using information from various stages of the process to assist in alignment and averaging at other stages of the process is an important aspect of embodiments of the present invention.
  • Image Transformation and Normalization
  • Analysis of an annular object such as a human iris can be facilitated by transformation of the image in rectangular co-ordinates to an image in polar coordinates as shown in FIGS. 2A and 2B. In one embodiment, there is provided a method for transforming images of the eye taken at near normal incidence using a rectangular to polar transformation.
  • In another embodiment, in situations in which a polar transformation is not the best model, a transformation into an alternative co-ordinate system may be used. The present invention contemplates biometric recognition for all beings having irises. Tracking the identity of livestock, zoo animals, pets and racing animals are potential applications of iris recognition. Some non-human species have eyes with irises or pupils that are non-circular. In these cases, a transformation into an alternative co-ordinate system may be preferable.
  • For example, the pupil of a cat is not circularly symmetric. The cat iris is more similar to a slit than a circular hole. Hence, an ellipse is a better model for the shape of the pupil than is a circle, and a form of elliptical coordinates as described in "Mathematical Methods for Physicists," by G. Arfken, (Academic Press, NY), may be a better coordinate system. Depending on the details of the iris and pupil structure, other classic coordinate systems described in Arfken and references cited therein may be useful. More complicated non-classic coordinate systems may also be useful. For example, we may use a physiological/mechanical model of pupil dilation to construct a specialized coordinate system in which the pupil dilation can be easily normalized. In some cases, we may model pupil dilation using elasticity theory; in such cases, the Schwarz-Christoffel transformation and related transformations from complex variable theory may be used.
  • When an image of the eye is captured from an angle relative to the optic axis of the eye, the foreshortening effect changes the apparent shape of the iris; it is no longer circular. In this case, a transformation into an alternative co-ordinate system may be preferable. One embodiment includes a method for transformation from the rectangular co-ordinate system of the image to another rectangular co-ordinate system that takes into account the projection, and a subsequent transformation to a coordinate system appropriate for the eye structure. After the coordinate transformation(s), a further step may include applying other transformations to the image.
  • For example, there may be a step to histogram equalize or gamma correct the image. The result of the coordinate transformation(s) and whatever other transformations applied is a normalized image ready for subsequent analysis. In an alternative embodiment, the method may include an evaluation of the original image or the normalized image to determine parts of the image not suitable for analysis for reasons such as specularities or occlusion.
  • Analysis of the Normalized Image
  • There are many embodiments of methods in accordance with the present invention for analysis of the transformed image. Here are a few examples.
  • Direct Normalized Correlation of Normalized Images
  • Normalized images can be directly compared by correlation analysis. Given two normalized images, A and B, subtract off their respective means and then compute the following formula:

  • C = A•B / √((A•A)(B•B))
  • as a function of barrel shift along the polar axis of the normalized image and select the maximum. In this formula, • indicates a dot product. The maximum C is the normalized correlation for the images and is a direct measure of similarity of the images. This process also determines the degree of rotation between the two images; it is simply the barrel shift at the maximum of C. This process can be reasonably fast because modern image processing boards often have specialized circuits to carry out this operation at high speed. For a normalized image N pixels high and M pixels wide, the computational cost is of the order of NM^2; note the polar nature of the co-ordinate system relieves difficulties at the horizontal edges of the image.
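The direct correlation over barrel shifts can be sketched with numpy as follows; this minimal version does not handle excluded (unsuitable) pixels, and the function name is illustrative.

```python
import numpy as np

def best_normalized_correlation(a, b):
    """Max over barrel shifts of C = A.B / sqrt((A.A)(B.B)) for two
    mean-subtracted normalized images; returns (best C, best shift)."""
    a = a - a.mean()
    b = b - b.mean()
    denom_a = np.sqrt((a * a).sum())
    best_c, best_shift = -np.inf, 0
    for shift in range(a.shape[1]):       # shift along the angular axis
        bs = np.roll(b, shift, axis=1)    # circular shift: no edge effects
        c = (a * bs).sum() / (denom_a * np.sqrt((bs * bs).sum()))
        if c > best_c:
            best_c, best_shift = c, shift
    return best_c, best_shift
```

The returned shift is the relative eye rotation between the two acquisitions, expressed in columns of the normalized image.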
  • If meta-data are included for the image that indicates some pixels in the image are not suitable for analysis, those pixels can be excluded from the analysis. Pixels might be excluded because they are part of a specularity in the iris or an occlusion of the iris.
  • The direct correlation method is presented first because it is a prototype for the approaches that follow. In the direct correlation method, there is a comparison of images rather than some more compact representation of the images. Compact representations have advantages for speed of comparison and size of storage for the templates. The more compact representations are divided into phase like, amplitude like and other, with sub divisions of local and global.
  • Extraction of Phase Like Global Measures
  • In this embodiment, the B image in the direct correlation example is replaced with two images; one has sine like character, the other has cosine like character. C_sine and C_cosine are computed as a function of barrel shift (excluding unsuitable pixels as discussed above). For each barrel shift, an angle σ is computed from the four quadrant arc-cosine: acos(C_sine, C_cosine). A function σ(barrel shift) is generated that is periodic in barrel shift, is representative of the image A, and can be compared with σ's extracted from other normalized iris images to determine similarity of the normalized iris images.
  • Exemplary comparison methods suitable for use with present invention include, but are not limited to: 1) normalized correlation of the σ's; and 2) quantization of the σ's and comparison of the resulting bit streams via Hamming distance.
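The global phase-like measure can be sketched as below. One assumption to flag: the text's "four quadrant arc-cosine" is implemented here with the standard two-argument, quadrant-resolving angle function `arctan2`; the probe images and function name are illustrative.

```python
import numpy as np

def phase_signature(a, probe_sine, probe_cosine):
    """Periodic function sigma(shift) characterizing normalized image A,
    built from its correlations with a sine-like and a cosine-like
    probe image at each barrel shift."""
    a = a - a.mean()
    sigma = []
    for shift in range(a.shape[1]):
        c_sin = (a * np.roll(probe_sine, shift, axis=1)).sum()
        c_cos = (a * np.roll(probe_cosine, shift, axis=1)).sum()
        # Four-quadrant angle of (C_cosine, C_sine).
        sigma.append(np.arctan2(c_sin, c_cos))
    return np.array(sigma)
```

Two such signatures can then be compared by normalized correlation, or quantized and compared via Hamming distance, per the exemplary methods above.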
  • Extraction of Phase Like Local Measures
  • In this embodiment of the present invention, the B image in the direct correlation example is replaced with a collection of 2N images composed of image pairs; one member of each pair has sine like character, the other has cosine like character. Each of the N pairs is designed to interrogate a specified region of the A image. Each of the specified regions may be an arbitrary region or an arbitrary collection of arbitrary sub-regions. C_sine(i) and C_cosine(i) are computed for each of the N pairs, where i is the location index (excluding unsuitable pixels as discussed above). For each pair, an angle σ_i is computed from the four quadrant arc-cosine: acos(C_sine(i), C_cosine(i)). A function σ(i) is calculated that is representative of the image A and can be compared with σ(i)'s extracted from other normalized iris images to determine similarity of the normalized iris images. The number of pixels included for each i may be tracked, and locations that do not have sufficient support or that fail some other quality criterion are excluded.
  • Comparison may be more complicated than in the global measure case, depending on the symmetry of the N pairs. If the N pairs are generated by successive barrel shifts of the first pair, the following exemplary processes may be used in accordance with the present invention: 1) normalized correlation of the σ's; and 2) quantization of the σ's and comparison of the resulting bit streams via Hamming distance, with barrel shifts of the bit streams.
  • Examples of regions for the first pair of a set of N barrel-shifted pairs which may be used in accordance with the present invention include, but are not limited to: 1) a single vertical (radial) region; 2) a group of isolated regions arranged vertically (radially); 3) a single diagonal (spiral) region; and 4) a group of isolated regions arranged diagonally (spirally).
  • Generation of regions with symmetries other than the barrel shift symmetry can provide other means for simplifying the comparison.
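The local phase-like measures, with the tracking of per-location support, might be sketched as follows, again assuming normalized iris images as 2-D arrays with angle along the columns. The `min_pixels` threshold, the barrel-shifted single-stripe region layout used in the usage example, and all names are assumptions for illustration.

```python
import numpy as np

def local_phase_measures(a, probe_pairs, region_masks, min_pixels=8):
    """sigma(i) for N (sine-like, cosine-like) probe pairs, each pair
    interrogating its own region of image A.  The pixel count of each
    region is tracked; locations without sufficient support are marked
    invalid (excluded) rather than given a meaningless angle."""
    sigmas, valid = [], []
    for (sin_p, cos_p), m in zip(probe_pairs, region_masks):
        if np.count_nonzero(m) < min_pixels:   # insufficient support
            sigmas.append(0.0)
            valid.append(False)
            continue
        c_sin = (a * sin_p)[m].sum()   # Csine(i)
        c_cos = (a * cos_p)[m].sum()   # Ccosine(i)
        sigmas.append(float(np.arctan2(c_sin, c_cos)))
        valid.append(True)
    return np.array(sigmas), np.array(valid)
```

When the N pairs are barrel shifts of a first pair, as in the single radial stripe example, the resulting σ(i) sequence inherits the shift symmetry, so two such sequences can be compared under relative barrel shifts of the bit streams.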
  • Extraction of Amplitude-Like Local or Global Measures
  • In an alternative embodiment, the sine-like and cosine-like image pairs in the analyses above can be replaced with single images, and the iris images processed in the same way, except that the correlation, C, is used rather than the angle computed from Csine and Ccosine.
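A hypothetical sketch of this amplitude-like variant: each probe is a single image, and the measure at each location is the masked normalized correlation C itself rather than an angle. The names and the choice of Pearson-style normalization are assumptions, not the patent's construction.

```python
import numpy as np

def local_amplitude_measures(a, probes, region_masks):
    """Amplitude-like local (or, with one full-image probe, global)
    measures: for each single probe image and region mask, return the
    masked normalized correlation C of image A with the probe."""
    out = []
    for p, m in zip(probes, region_masks):
        av = a[m] - a[m].mean()
        pv = p[m] - p[m].mean()
        denom = np.sqrt((av * av).sum() * (pv * pv).sum())
        out.append(float((av * pv).sum() / denom) if denom > 0 else 0.0)
    return np.array(out)
```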
  • Mixed Modes
  • In an alternative embodiment, these techniques can be combined in a variety of ways including, but not limited to: 1) using an amplitude-like or phase-like, local or global measure tuned to extract the barrel shift needed to align a pair of normalized images, and then performing a direct correlation analysis of the aligned images; 2) using a low-resolution, high-false-match-rate, local or global, amplitude-like or phase-like measure to select candidates for direct correlation analysis of the normalized images; and/or 3) using a low-resolution, high-false-match-rate, local or global, amplitude-like or phase-like measure to select candidates and to define the barrel shift for angular registration of the images. Amplitude-like or phase-like local measures can then be computed at a collection of sites on any arbitrary matrix imposed on the registered normalized images.
  • One having ordinary skill in the art will appreciate that alternative alignment methods may also be used. For example, two normalized images can be aligned using FFT techniques: transform both images via FFT, take the product of one transform with the complex conjugate of the other, normalize appropriately, transform back, and then find the peak in the resulting cross-correlation function.
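The FFT alignment just described can be sketched as follows. This is an illustrative phase-correlation variant (one reasonable reading of "normalize appropriately"), assuming the barrel shift is a circular shift along the angular (column) axis; the function name is hypothetical.

```python
import numpy as np

def fft_align(a, b):
    """FFT-based barrel-shift alignment of two normalized iris images:
    transform both along the column axis, take the product of the
    conjugate of one transform with the other, normalize each frequency
    bin to unit magnitude (phase correlation), transform back, and
    locate the peak of the circular cross-correlation.  Returns the
    shift s such that np.roll(a, s, axis=1) best matches b."""
    fa = np.fft.fft(a - a.mean(), axis=1)
    fb = np.fft.fft(b - b.mean(), axis=1)
    cross = np.conj(fa) * fb
    cross /= np.abs(cross) + 1e-12   # unit-magnitude normalization
    corr = np.fft.ifft(cross, axis=1).real.sum(axis=0)
    return int(np.argmax(corr))
```

The recovered shift can then seed a direct correlation analysis of the registered images, as in the mixed-mode embodiments above.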
  • It is to be understood that the exemplary embodiments are merely illustrative of the invention and that many variations of the above-described embodiments can be devised by one skilled in the art without departing from the scope of the invention. It is therefore intended that all such variations be included within the scope of the following claims and their equivalents.

Claims (9)

1. A method for identifying an iris image, the method comprising the steps of:
obtaining an iris image of an eye;
generating, from the iris image, an iris template;
generating at least one characteristic metric from at least one of the iris template, the iris image and an intermediate product of a process of generating the iris template;
assigning the iris template to one of a plurality of categories based on the at least one characteristic metric; and
matching the iris template with one of a plurality of other iris templates stored in the assigned category.
2. The method of claim 1, wherein the at least one characteristic metric is derived from the energy spectrum of the iris template.
3. A method for building an iris template database, the method comprising the steps of:
obtaining a plurality of iris images of eyes;
generating, from each iris image, an iris template;
generating at least one characteristic metric from at least one of the iris template, the iris image, and an intermediate product of a process of generating the iris template;
assigning each iris template to one of a plurality of categories based on the at least one characteristic metric; and
storing each iris template in a data structure searchable based on the category.
4. The method of claim 3, wherein the at least one characteristic metric is derived from the energy spectrum of the iris template.
5. A method for identifying an iris image, the method comprising the steps of:
obtaining an iris image of an eye;
generating, from the iris image, an original iris template;
generating, from the original iris template, an error-corrected reduced iris template;
comparing the error-corrected reduced iris template with a plurality of previously stored other error-corrected reduced iris templates to select at least one candidate for full unmodified template matching with the original template; and
comparing the original iris template with the corresponding other previously stored original iris templates of the selected at least one candidate.
6. A method for building an iris template database, the method comprising the steps of:
obtaining a plurality of iris images of eyes;
generating, from each iris image, an iris template;
generating an error-corrected reduced iris template from each iris template;
storing each reduced iris template in an indexed data structure allowing efficient access based on the content of the error-corrected reduced iris template; and
associating with each entry in this data structure the corresponding original iris template.
7. A method for identifying an iris image, the method comprising the steps of:
obtaining an iris image of an eye;
segmenting the iris image;
establishing a non-polar coordinate system on the segmented iris image wherein the non-polar coordinate system models dilation of a pupil of the eye;
defining a plurality of regions within the iris image;
analyzing the iris to generate an iris code; and
comparing the iris code with a previously generated reference iris code within the plurality of regions to determine a measure of similarity between the iris code and the reference iris code.
8. The method of claim 7, wherein the non-polar coordinate system is one of a toroidal coordinate system and a non-classical complex coordinate system.
9. The method of claim 8, wherein the step of establishing a non-polar coordinate system further comprises the steps of:
applying a coordinate transformation to the segmented iris image to produce a modified segmented iris image in rectangular coordinates; and
applying an additional transformation to establish the non-polar coordinate system.
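The two-stage matching of claims 5 and 6 — shortlist candidates by comparing small error-corrected reduced templates, then run full unmodified matching on the originals — can be illustrated with a sketch. Everything here is hypothetical: the majority-vote block reduction standing in for "error correction," the shortlist size, and all names are assumptions, not the claimed construction.

```python
import numpy as np

def reduce_template(template, block=4):
    """Hypothetical error-corrected reduced template: majority vote over
    fixed-size blocks of bits, shrinking the template and suppressing
    isolated bit errors."""
    t = np.asarray(template, dtype=np.uint8)
    t = t[: (t.size // block) * block].reshape(-1, block)
    return (t.sum(axis=1) * 2 >= block).astype(np.uint8)

def two_stage_match(query, originals, reduced_db, shortlist=3):
    """Stage 1: compare the reduced query against the stored reduced
    templates and shortlist the closest candidates.  Stage 2: full
    unmodified matching of the original template against only those
    candidates; return the index of the best match."""
    rq = reduce_template(query)
    coarse = [np.count_nonzero(rq != r) for r in reduced_db]
    candidates = np.argsort(coarse)[:shortlist]
    fine = {i: np.count_nonzero(query != originals[i]) / query.size
            for i in candidates}
    return min(fine, key=fine.get)
```

The coarse stage touches only the small reduced templates, so the expensive full comparison runs on a handful of candidates instead of the whole database.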
US12/770,118 2005-08-25 2010-04-29 Methods and systems for biometric identification Abandoned US20100266165A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/770,118 US20100266165A1 (en) 2005-08-25 2010-04-29 Methods and systems for biometric identification

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US71110505P 2005-08-25 2005-08-25
US71110605P 2005-08-25 2005-08-25
US71110705P 2005-08-25 2005-08-25
US11/510,197 US7751598B2 (en) 2005-08-25 2006-08-25 Methods and systems for biometric identification
US12/770,118 US20100266165A1 (en) 2005-08-25 2010-04-29 Methods and systems for biometric identification

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/510,197 Continuation US7751598B2 (en) 2005-08-25 2006-08-25 Methods and systems for biometric identification

Publications (1)

Publication Number Publication Date
US20100266165A1 true US20100266165A1 (en) 2010-10-21

Family

ID=37772508

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/510,197 Active 2026-12-16 US7751598B2 (en) 2005-08-25 2006-08-25 Methods and systems for biometric identification
US12/770,118 Abandoned US20100266165A1 (en) 2005-08-25 2010-04-29 Methods and systems for biometric identification

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/510,197 Active 2026-12-16 US7751598B2 (en) 2005-08-25 2006-08-25 Methods and systems for biometric identification

Country Status (2)

Country Link
US (2) US7751598B2 (en)
WO (1) WO2007025258A2 (en)

Families Citing this family (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8090157B2 (en) 2005-01-26 2012-01-03 Honeywell International Inc. Approaches and apparatus for eye detection in a digital image
US7593550B2 (en) 2005-01-26 2009-09-22 Honeywell International Inc. Distance iris recognition
US7761453B2 (en) * 2005-01-26 2010-07-20 Honeywell International Inc. Method and system for indexing and searching an iris image database
US8442276B2 (en) 2006-03-03 2013-05-14 Honeywell International Inc. Invariant radial iris segmentation
US8705808B2 (en) * 2003-09-05 2014-04-22 Honeywell International Inc. Combined face and iris recognition system
US8098901B2 (en) 2005-01-26 2012-01-17 Honeywell International Inc. Standoff iris recognition system
US8064647B2 (en) 2006-03-03 2011-11-22 Honeywell International Inc. System for iris detection tracking and recognition at a distance
US7953983B2 (en) 2005-03-08 2011-05-31 Microsoft Corporation Image or pictographic based computer login systems and methods
GB2450024B (en) * 2006-03-03 2011-07-27 Honeywell Int Inc Modular biometrics collection system architecture
KR101299074B1 (en) 2006-03-03 2013-08-30 허니웰 인터내셔널 인코포레이티드 Iris encoding system
WO2007101275A1 (en) * 2006-03-03 2007-09-07 Honeywell International, Inc. Camera with auto-focus capability
AU2007220010B2 (en) 2006-03-03 2011-02-17 Gentex Corporation Single lens splitter camera
KR101308368B1 (en) 2006-03-03 2013-09-16 허니웰 인터내셔널 인코포레이티드 An iris recognition system having image quality metrics
WO2008039252A2 (en) * 2006-05-15 2008-04-03 Retica Systems, Inc. Multimodal ocular biometric system
EP2062197A4 (en) * 2006-09-15 2010-10-06 Retica Systems Inc Long distance multimodal biometric system and method
US8170293B2 (en) 2006-09-15 2012-05-01 Identix Incorporated Multimodal ocular biometric system and methods
US8121356B2 (en) 2006-09-15 2012-02-21 Identix Incorporated Long distance multimodal biometric system and method
US7970179B2 (en) 2006-09-25 2011-06-28 Identix Incorporated Iris data extraction
US8063889B2 (en) 2007-04-25 2011-11-22 Honeywell International Inc. Biometric data collection system
TW200907827A (en) * 2007-08-08 2009-02-16 Acer Inc System and method for performing objects with bio-characteristics recognition
US8260009B2 (en) * 2007-08-15 2012-09-04 Indiana University Research And Technology Corp. System and method for measuring clarity of images used in an iris recognition system
US20100202669A1 (en) * 2007-09-24 2010-08-12 University Of Notre Dame Du Lac Iris recognition using consistency information
US20090092283A1 (en) * 2007-10-09 2009-04-09 Honeywell International Inc. Surveillance and monitoring system
US8126858B1 (en) 2008-01-23 2012-02-28 A9.Com, Inc. System and method for delivering content to a communication device in a content delivery system
US8189879B2 (en) * 2008-02-14 2012-05-29 Iristrac, Llc System and method for animal identification using IRIS images
US8024775B2 (en) * 2008-02-20 2011-09-20 Microsoft Corporation Sketch-based password authentication
US8436907B2 (en) 2008-05-09 2013-05-07 Honeywell International Inc. Heterogeneous video capturing system
US8090246B2 (en) 2008-08-08 2012-01-03 Honeywell International Inc. Image acquisition system
CN101727574A (en) * 2008-10-17 2010-06-09 深圳富泰宏精密工业有限公司 Iris recognition system and method
EP2187338A1 (en) * 2008-11-13 2010-05-19 Berner Fachhochschule, Technik und Informatik (TI) Biometric pseudonyms of a fixed-sized template
US8280119B2 (en) * 2008-12-05 2012-10-02 Honeywell International Inc. Iris recognition system using quality metrics
US8768014B2 (en) 2009-01-14 2014-07-01 Indiana University Research And Technology Corp. System and method for identifying a person with reference to a sclera image
US8630464B2 (en) * 2009-06-15 2014-01-14 Honeywell International Inc. Adaptive iris matching using database indexing
US8472681B2 (en) 2009-06-15 2013-06-25 Honeywell International Inc. Iris and ocular recognition system using trace transforms
US8458485B2 (en) 2009-06-17 2013-06-04 Microsoft Corporation Image-based unlock functionality on a computing device
FR2947136B1 (en) * 2009-06-22 2011-07-15 Groupe Ecoles Telecomm METHOD FOR VERIFYING THE IDENTITY OF AN INDIVIDUAL
US8577094B2 (en) * 2010-04-09 2013-11-05 Donald Martin Monro Image template masking
KR101046459B1 (en) * 2010-05-13 2011-07-04 아이리텍 잉크 An iris recognition apparatus and a method using multiple iris templates
US8742887B2 (en) 2010-09-03 2014-06-03 Honeywell International Inc. Biometric visitor check system
US8463036B1 (en) 2010-09-30 2013-06-11 A9.Com, Inc. Shape-based search of a collection of content
US8422782B1 (en) 2010-09-30 2013-04-16 A9.Com, Inc. Contour detection and image classification
US8447107B1 (en) * 2010-09-30 2013-05-21 A9.Com, Inc. Processing and comparing images
US8990199B1 (en) 2010-09-30 2015-03-24 Amazon Technologies, Inc. Content search with category-aware visual similarity
US8254768B2 (en) * 2010-12-22 2012-08-28 Michael Braithwaite System and method for illuminating and imaging the iris of a person
US8831416B2 (en) * 2010-12-22 2014-09-09 Michael Braithwaite System and method for illuminating and identifying a person
AU2011202415B1 (en) 2011-05-24 2012-04-12 Microsoft Technology Licensing, Llc Picture gesture authentication
US8995729B2 (en) * 2011-08-30 2015-03-31 The Mitre Corporation Accelerated comparison using scores from coarse and fine matching processes
US8798332B2 (en) 2012-05-15 2014-08-05 Google Inc. Contact lenses
US9501718B1 (en) * 2013-01-15 2016-11-22 Marvell International Ltd. Image-based control of lighting systems
US10025982B2 (en) 2013-10-08 2018-07-17 Princeton Identity, Inc. Collecting and targeting marketing data and information based upon iris identification
US10042994B2 (en) 2013-10-08 2018-08-07 Princeton Identity, Inc. Validation of the right to access an object
US10038691B2 (en) 2013-10-08 2018-07-31 Princeton Identity, Inc. Authorization of a financial transaction
JP6557222B2 (en) 2013-10-08 2019-08-07 プリンストン アイデンティティー インク Iris biometric recognition module and access control assembly
KR20170046108A (en) * 2014-05-09 2017-04-28 아이플루언스, 인크. Systems and methods for using eye signals with secure mobile communications
US10425814B2 (en) 2014-09-24 2019-09-24 Princeton Identity, Inc. Control of wireless communication device capability in a mobile device with a biometric key
JP2018506872A (en) 2014-12-03 2018-03-08 プリンストン・アイデンティティー・インコーポレーテッド System and method for mobile device biometric add-on
US9836663B2 (en) * 2015-03-05 2017-12-05 Samsung Electronics Co., Ltd. User authenticating method and head mounted device supporting the same
CN105631394B (en) * 2015-05-29 2019-08-02 宇龙计算机通信科技(深圳)有限公司 Iris information acquisition method, iris information acquisition device and terminal
KR20230150397A (en) 2015-08-21 2023-10-30 매직 립, 인코포레이티드 Eyelid shape estimation using eye pose measurement
CN105069446B (en) * 2015-09-23 2019-08-02 宇龙计算机通信科技(深圳)有限公司 Iris recognition method, iris authentication device and terminal
EP3761232A1 (en) 2015-10-16 2021-01-06 Magic Leap, Inc. Eye pose identification using eye features
KR20180102637A (en) 2016-01-12 2018-09-17 프린스톤 아이덴티티, 인크. Systems and methods of biometric analysis
WO2017172695A1 (en) 2016-03-31 2017-10-05 Princeton Identity, Inc. Systems and methods of biometric anaysis with adaptive trigger
US10366296B2 (en) 2016-03-31 2019-07-30 Princeton Identity, Inc. Biometric enrollment systems and methods
US10607096B2 (en) * 2017-04-04 2020-03-31 Princeton Identity, Inc. Z-dimension user feedback biometric system
KR102573482B1 (en) 2017-07-26 2023-08-31 프린스톤 아이덴티티, 인크. Biometric security system and method
CN107480608A (en) * 2017-07-29 2017-12-15 广东欧珀移动通信有限公司 Anti-fake processing method and related product

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5291560A (en) * 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
US20020118864A1 (en) * 2001-02-28 2002-08-29 Kenji Kondo Personal authentication method and device
US20020154794A1 (en) * 2001-03-06 2002-10-24 Seong-Won Cho Non-contact type human iris recognition method for correcting a rotated iris image
US6714665B1 (en) * 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
US6944318B1 (en) * 1999-01-15 2005-09-13 Citicorp Development Center, Inc. Fast matching systems and methods for personal identification
US7693307B2 (en) * 2003-12-18 2010-04-06 Sagem Defense Securite Method and apparatus for iris recognition

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3436293B2 (en) * 1996-07-25 2003-08-11 沖電気工業株式会社 Animal individual identification device and individual identification system
US7277561B2 (en) * 2000-10-07 2007-10-02 Qritek Co., Ltd. Iris identification
EP1671258A4 (en) * 2003-09-04 2008-03-19 Sarnoff Corp Method and apparatus for performing iris recognition from an image

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9613281B2 (en) 2005-11-11 2017-04-04 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US9792499B2 (en) 2005-11-11 2017-10-17 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US10102427B2 (en) 2005-11-11 2018-10-16 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US8824779B1 (en) * 2011-12-20 2014-09-02 Christopher Charles Smyth Apparatus and method for determining eye gaze from stereo-optic views
CN107683446A (en) * 2015-03-13 2018-02-09 感官运动仪器创新传感器有限公司 Method and eye tracks equipment at least one user of automatic identification eye tracks equipment
US11003245B2 (en) 2015-03-13 2021-05-11 Apple Inc. Method for automatically identifying at least one user of an eye tracking device and eye tracking device
RU2630742C1 (en) * 2016-07-18 2017-09-12 Самсунг Электроникс Ко., Лтд. Method, system and device for biometric iris recognition
US10445574B2 (en) 2016-07-18 2019-10-15 Samsung Electronics Co., Ltd. Method and apparatus for iris recognition

Also Published As

Publication number Publication date
WO2007025258A3 (en) 2008-11-20
US7751598B2 (en) 2010-07-06
US20070047772A1 (en) 2007-03-01
WO2007025258A2 (en) 2007-03-01

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION