A. K. Jain, A. Ross, and S. Prabhakar, “An introduction to biometric recognition,” IEEE Trans. Circuits Syst. Video Technol., vol. 14, 2004, discloses that the iris of the eye is a near-ideal biometric. An image of an iris is typically best acquired in a dedicated imaging system that uses infra-red (IR) illumination, usually near infra-red (NIR) in the range 700-900 nm, with the eye aligned with the acquisition camera. Nonetheless, systems supporting the acquisition of iris data from mobile persons are known, for example, as disclosed in J. R. Matey, O. Naroditsky, K. Hanna, R. Kolczynski, D. J. Lolacono, S. Mangru, M. Tinker, T. M. Zappia, and W. Y. Zhao, “Iris on the Move: Acquisition of Images for Iris Recognition in Less Constrained Environments,” Proc. IEEE, vol. 94, 2006. That system employs specialized lighting and requires people to walk along a specified path where multiple successive iris images are acquired under controlled lighting conditions. The system is proposed for airports, where iris information is increasingly being used to verify passenger identity.
J. Daugman, “How iris recognition works,” IEEE Trans. Circuits Syst. Video Technol., vol. 14, no. 1, pp. 21-30, 2004, discloses that in order to correctly identify an iris, the iris pattern needs to extend across at least 80 pixels of an image. Other studies indicate that an iris pattern may need to extend across up to 200 pixels of an image for reliable iris identification.
For a typical 8 megapixel (Mp) camera with a field of view (FoV) of 70 degrees, assuming a wavelength of 550 nm (near the center of the visible spectrum) and a subject face at a distance of approximately 250 mm, it can be shown that a typical iris of 12 mm diameter would extend across the minimum resolution of 80 pixels and so might meet the required criteria for identification. However, because the achievable resolution is diffraction limited, and the diffraction-limited spot size grows in proportion to wavelength, repeating the same calculation at a NIR wavelength of 900 nm shows that the resolution is reduced to about 50 pixels. So while a typical 8 Mp image sensor might resolve an iris at visible wavelengths, the sensor would not be optimal at NIR wavelengths.
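The wavelength dependence above is consistent with a diffraction-limited model, in which the smallest resolvable angle scales with wavelength (Rayleigh criterion). The following sketch reproduces the quoted figures under assumed parameters; the ~1.1 mm aperture diameter is a hypothetical value chosen so that the visible-light result matches the 80-pixel figure, and is not stated in the passage.

```python
def resolvable_spots_across_iris(wavelength_m, aperture_m,
                                 iris_mm=12.0, distance_mm=250.0):
    """Approximate number of diffraction-limited resolvable spots across an iris.

    Divides the iris's angular extent (small-angle approximation) by the
    smallest resolvable angle from the Rayleigh criterion,
    theta_min ~= 1.22 * wavelength / aperture_diameter.
    """
    iris_angle = iris_mm / distance_mm            # angular size of iris, radians
    theta_min = 1.22 * wavelength_m / aperture_m  # Rayleigh criterion, radians
    return iris_angle / theta_min

# Hypothetical aperture of ~1.1 mm (an assumption, typical of small camera modules)
aperture = 1.1e-3

visible = resolvable_spots_across_iris(550e-9, aperture)  # visible, 550 nm
nir = resolvable_spots_across_iris(900e-9, aperture)      # NIR, 900 nm
```

With these assumptions the model gives roughly 79 resolvable spots at 550 nm and 48 at 900 nm, in line with the approximate 80- and 50-pixel figures; note that the ratio between the two results is fixed at 900/550 ≈ 1.64 irrespective of the assumed aperture.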
These calculations suggest that implementing iris recognition in a consumer device would require either a movable NIR filter to be added to an imaging system which is optimized for both visible and NIR image capture, or alternatively a dedicated imaging system for iris-image acquisition to be provided. Clearly, these solutions would add to the cost of such a device, making implementation prohibitive, especially in consumer devices.
As will be understood, the acquisition challenges to implementing iris imaging, especially for the purposes of identification, on consumer devices such as smartphones, include that: (i) the iris is a relatively small target located on a wet, curved and reflective surface; (ii) occlusions occur due to eyelids and eyelashes; (iii) specular reflections obscure portions of the iris pattern; (iv) the iris pattern is most responsive to NIR illumination; (v) visible light can distort the fundamental iris pattern, especially in a darkly pigmented iris; and (vi) image acquisition requires a high degree of user cooperation.
The prior art includes U.S. Pat. No. 6,870,690, which describes a single optical system used to image two different optical bands, for example visible and infrared, with the same possible adjustments in zoom and/or focus. A dual-band singlet is formed of a first, larger, optical element, suitable for operating on light of a first optical band, with an aperture cut out of it. A smaller element, suitable for operating on light of a second optical band, is secured in, or on either side of, the aperture cut through the larger optical element, thus forming a dual-band singlet that can operate on two different wavelength bands. Combinations of dual-band lenses, lens elements, and lenses with cut-out apertures are used to form dual-band optical systems, including systems with dual-band focus and zoom.
U.S. Pat. No. 8,059,344 describes an optical lens system that focuses both a first band of wavelengths and a second band of wavelengths onto a common focal plane. The system uses lens segmentation where a second region made of a second material is different from a first material, and is shaped to partially focus the second band of wavelengths onto the focal plane.
U.S. Pat. No. 6,870,690 and U.S. Pat. No. 8,059,344 are similar in that each involves an annular outer lens region designed to focus a first set of wavelengths and an inner lens region designed to focus a second set of wavelengths. Such a two-component structure involves complex manufacture and assembly and so would be prohibitively expensive to include in consumer electronic devices. It should also be noted that in U.S. Pat. No. 8,059,344, the inner lens does not transmit the wavelengths of the outer region, so there is a blind spot at the center of the lens.
U.S. Pat. No. 7,881,603 describes a dichroic filter for use with an electronic imaging device, such as a camera. The dichroic filter is located in the main imaging lens, and may permit all light to pass through a first portion and be measured by a photosensor, while restricting at least some portions of visible light from passing through a second portion thereof. In this manner, only the non-restricted portions of visible light passing through the second portion may be measured by the associated pixels of the photosensor.
US 2010/0066854 describes a camera comprising an imaging system having a first depth of field for one or more first colours and a second depth of field, smaller than the first depth of field, for one or more second colours. The imaging system may comprise an iris with a first aperture for the first colour or colours and a second aperture, larger than the first, for the second colour or colours. The first aperture may be defined by an outer opaque ring and the second by an inner chromatic ring that blocks the first colour(s) and passes the second colour(s). The image formed of the first colour(s) is sharper, and its sharpness may be transposed by image processing to the other images. However, both U.S. Pat. No. 7,881,603 and US 2010/0066854 are concerned only with processing visible wavelengths.
In U.S. Pat. No. 8,294,808, a dual field-of-view optical imaging system is provided for obtaining two images of a scene, each image having a different field of view. The dual field-of-view optical imaging system includes a frontal dual focus lens, the dual focus lens having a central zone of focal length f1 producing a wide field-of-view image at a first focal plane and a peripheral zone of focal length f2 greater than f1 producing a narrow field-of-view image at a second focal plane; and a detector for detecting and acquiring the wide field-of-view image and the narrow field-of-view image, the detector being movable along an optical path for selective positioning at the first focal plane or the second focal plane.
US 2013/0033579 discloses processing multi-aperture image data by: (i) capturing image data associated with one or more objects by simultaneously exposing an image sensor in an imaging system to spectral energy associated with at least a first part of the electromagnetic spectrum using at least a first aperture, and to spectral energy associated with at least a second part of the electromagnetic spectrum using at least a second aperture; (ii) generating first image data associated with said first part of the electromagnetic spectrum and second image data associated with said second part of the electromagnetic spectrum; and (iii) generating depth information associated with said captured image on the basis of first sharpness information in at least one area of said first image data and second sharpness information in at least one area of said second image data.
US 2013/0113988 discloses forming an image of a scene, including capturing an image of the scene by exposing an image sensor to radiation from a first part of the EM spectrum using a first aperture and to radiation from a second part of the EM spectrum using a second aperture having a different size from the first. Simultaneously with capturing the image, the scene is illuminated with radiation from the second part of the EM spectrum. The image is then formed on the basis of image data generated by the radiation from the first part of the EM spectrum and image data generated by radiation from the second part of the EM spectrum.
Each of U.S. Pat. No. 8,294,808, US 2013/0033579 and US 2013/0113988 requires that the two different regions of the electromagnetic spectrum be focused to different depths.
US 2012/0242857 discloses a system for increasing depth of field in camera lenses, relying on half-aperture filtering. The imaging system has an aperture with a first region arranged to pass at least optical radiation in a first frequency band and substantially block optical radiation in a second frequency band, and a second region arranged to pass at least optical radiation in the second frequency band, the center of the first region not overlapping the center of the second region.
US 2011/205651 discloses a photodynamic diagnosis or therapy system including an aperture stop including a filter area formed on a flat-plate substrate, and a variable aperture area formed inside the filter area. The filter area transmits infrared light and reduces or blocks the transmission of visible light. The aperture area transmits light in the wavelength range corresponding to fluorescent light from an observed area of a subject and light in the wavelength range corresponding to illumination light to the subject. The system makes it possible to simultaneously observe a subject image formed by illumination light in the visible light band and an observed image formed by weak fluorescent light from the observed area of the subject in the infrared light band.