High-performance eye tracking systems often use active illumination of a subject's eyes. Typically, one or more light sources emit near-infrared light and one or more cameras capture images of the illuminated eyes. Various features are then extracted from the captured images, such as the specular reflections of the illumination light source on the cornea (so-called glints) and/or the position of the pupil relative to the position of the eye.
There are two principal illumination strategies, which differ in how the pupil appears in the captured images. If the light source is located close to the camera, the pupil typically appears in the resulting image as image elements with a relatively bright signal level compared to the surrounding iris. If, instead, the light source is located further away from the camera, the pupil normally appears comparatively dark in the captured images. These strategies are referred to as bright pupil (BP) imaging and dark pupil (DP) imaging, respectively.
For some people, the contrast between iris and pupil, and hence the stability of the eye tracking algorithms, may be optimal in BP mode, whereas for other people, who lack a distinct bright-pupil effect, DP images will have the best contrast. Since an automatic eye/gaze tracker cannot know beforehand which category a particular subject belongs to, it is an advantage if the system is capable of capturing both BP and DP images. This, in turn, requires either two cameras or the capture of two images separated in time. The former alternative entails added cost and technical complexity, whereas the latter suffers from motion-related errors.
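One way a system capable of both image types might choose between them is to compare pupil/iris contrast per subject. The following is a minimal illustrative sketch, not taken from any described system; the function names, the use of Michelson contrast, and the assumption that pupil and iris pixel regions have already been segmented are all assumptions introduced here:

```python
import numpy as np

def pupil_iris_contrast(pupil_pixels, iris_pixels):
    """Michelson contrast between mean pupil and mean iris intensity.
    Assumes the pupil and iris regions have already been segmented."""
    p = float(np.mean(pupil_pixels))
    i = float(np.mean(iris_pixels))
    return abs(p - i) / (p + i)

def select_mode(bp_pupil, bp_iris, dp_pupil, dp_iris):
    """Pick the illumination mode giving the better pupil/iris contrast."""
    bp = pupil_iris_contrast(bp_pupil, bp_iris)
    dp = pupil_iris_contrast(dp_pupil, dp_iris)
    return "BP" if bp >= dp else "DP"
```

A tracker could run such a comparison once during calibration and then stay in the selected mode for the session.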
Further, in some implementations it may be advantageous to use two or more light sources that are physically separated from one another, so as to obtain two or more glint signals in the image of the eye. This, in combination with the known geometry of how the camera and light sources are arranged in the eye/gaze tracker, can be used to calculate both the radius of the cornea and the distance to the eye. However, simultaneous illumination from two or more light sources is problematic because it may result in a hybrid DP/BP effect in the registered images, causing poor pupil/iris contrast. Moreover, if the glints are located too close to one another relative to the available resolution (due to limited optical performance, the size of the pixels, or a combination thereof), the glints will blend into each other, causing inaccuracy when determining the center-of-glint positions. Additionally, such intermixed glints may obscure the pupil edge, which must be visible for the eye tracking algorithms to be accurate.
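The geometric principle can be sketched under simplifying assumptions (distant point sources, small angles, camera co-located with the sources); the symbols below are introduced here for illustration only. The cornea acts approximately as a convex mirror of radius $R$ and focal length $R/2$, so each glint is a virtual image formed roughly a distance $R/2$ behind the corneal surface:

```latex
% Two sources with baseline b, eye at distance d from the camera/sources.
% Angular separation of the sources as seen from the eye:
\theta \approx \frac{b}{d}
% Separation of the two glints on the cornea (convex mirror, f = R/2):
s \approx \frac{R}{2}\,\theta \approx \frac{R\,b}{2d}
% Angular glint separation observed by the camera at distance d:
\theta_g \approx \frac{s}{d} \approx \frac{R\,b}{2d^2}
```

A single measurement of $\theta_g$ thus constrains only the combination $R/d^2$; the further known geometry (for example, the glint positions relative to the camera axis) supplies the additional constraints needed to solve for $R$ and $d$ separately.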
U.S. Pat. No. 6,959,102 describes a method for increasing the signal-to-noise ratio in infrared-based eye gaze trackers used in the presence of ambient light, such as sunlight. Here, an infrared illuminator is modulated in synchrony with a camera, such that, for example, a first frame contains both the illuminator signal and the ambient radiation information, while a second frame contains only the ambient radiation information. Provided that the ambient radiation varies slowly, its contribution can be subtracted from the image data of the first frame, thereby improving the eye/gaze tracking.
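The subtraction step can be illustrated with a minimal sketch, assuming 8-bit frames and negligible ambient change between the two exposures; the function name is an assumption, not taken from the patent:

```python
import numpy as np

def remove_ambient(lit_frame, ambient_frame):
    """Estimate the illuminator-only image by subtracting an
    ambient-only frame captured with the IR illuminator off.
    Assumes the ambient light varies slowly between the two frames."""
    # Widen the integer type before subtracting to avoid uint8 wrap-around.
    diff = lit_frame.astype(np.int32) - ambient_frame.astype(np.int32)
    return np.clip(diff, 0, 255).astype(np.uint8)
```

Clipping at zero handles pixels where the ambient estimate momentarily exceeds the lit frame, e.g. due to noise or motion.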
EP 2 519 001 discloses another solution for determining the ambient radiation contribution to a scene. Here, a stream of light pulses periodically illuminates the scene. Light reflected from the scene is registered in such a manner that, during a light pulse, electric charges originating from the incoming light are stored in a first storage element of a pixel in an image sensor; between the light pulses, electric charges originating from the incoming light are stored in a second storage element of the same pixel.
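A simple numerical model of such a two-storage-element pixel might look as follows; the names and the per-timestep discretization are illustrative assumptions, not taken from EP 2 519 001:

```python
import numpy as np

def gated_integration(incident, pulse_on):
    """Model one pixel with two storage elements: charge arriving while
    the illuminator pulse is on accumulates in storage A (pulse + ambient);
    charge arriving between pulses accumulates in storage B (ambient only).
    Returns the estimated pulse-only signal, with B rescaled so that the
    two integration windows are compared over equal durations."""
    incident = np.asarray(incident, dtype=float)
    pulse_on = np.asarray(pulse_on, dtype=bool)
    store_a = incident[pulse_on].sum()       # during pulses
    store_b = incident[~pulse_on].sum()      # between pulses
    n_on = pulse_on.sum()
    n_off = (~pulse_on).sum()
    return store_a - store_b * (n_on / n_off)
```

Because both storage elements belong to the same pixel, the ambient estimate is spatially exact, which is an advantage over subtracting a separate full frame.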
Consequently, solutions exist for reducing the influence of ambient light, e.g. sunlight, which may generate patterns and/or reflections that disturb the eye tracking algorithms and degrade eye tracker performance. However, as discussed above, several additional problems must also be addressed to improve the tracking robustness of the eye/gaze tracker.