The use of gaze tracking to control the selection of options on a display device, or the identification of objects of interest in a displayed image, continues to increase, particularly in view of the availability of low-cost gaze tracking devices and software, such as offered by Tobii Technology, Inc. and The Eye Tribe.
Gaze tracking may also be used in a ‘passive’ mode, wherein a user's gazing pattern is monitored to determine the user's reaction to image composition, such as the arrangement of elements on a web-page, a gaming program, and so on. Passive monitoring may also be used for physiological research, such as determining how different people react to different simulated vehicular situations. In like manner, passive monitoring may be implemented in a vehicle to issue an alert if the driver's gaze is abnormal.
In a conventional gaze tracking device, the angular deviation from ‘straight ahead’ is measured, using optical devices, including cameras and infrared emitters and receivers, or other devices, such as devices that measure biometric parameters. Example gaze tracking devices and their applications may be found in USPA 2011/0069277, “VISUAL DISPLAY WITH ILLUMINATORS FOR GAZE TRACKING”, filed Mar. 24, 2011 for Blixt et al.; USPA 2014/0268055, “EYE/GAZE TRACKER AND METHOD OF TRACKING THE POSITION OF AN EYE AND/OR A GAZE POINT OF A SUBJECT”, filed Mar. 7, 2014 for Skogo et al.; USPA 2013/0235347, “SYSTEM AND METHOD FOR INTERACTING WITH AND ANALYZING MEDIA ON A DISPLAY USING EYE GAZE TRACKING”, filed Apr. 25, 2013 for Hennessey et al.; USPA 2009/0268045, “APPARATUS AND METHODS FOR CONFIGURATION AND OPTIMIZATION OF IMAGE SENSORS FOR GAZE TRACKING APPLICATIONS”, filed Aug. 4, 2008 for Sur et al.; USPA 2013/0002846, “SYSTEM AND METHOD FOR TRACKING THE POINT OF GAZE OF AN OBSERVER”, filed Mar. 15, 2011 for Frederick Jan De Bruijn, Karl Catherine Van Bree, and Tommaso Gritti; and USPA 2014/0160005, “APPARATUS AND METHOD FOR CONTROLLING GAZE TRACKING”, filed May 7, 2013 for Lee et al. Each of these cited publications is incorporated by reference herein.
To accurately map a user's gaze to a location on a display screen, gaze tracking devices determine and apply a calibration procedure to compensate for the optical characteristics of each user. This calibration may be active, wherein, for example, the user is directed to gaze at particular points on the display screen, or passive, wherein, for example, the device provides stimuli from a known location while the user is viewing the display device, and detects the angle at which the stimuli strike the user's eye(s).
FIG. 1 illustrates an example illumination pattern 100 that may be used during an active calibration, such as used in a device provided by The Eye Tribe. The pattern comprises a dark background 110 with nine selectable illumination points 120, or ‘targets’. The calibration element of the device selectively illuminates one of the illumination points and detects the angle(s) of the user's eye(s) as the user gazes at the illuminated point. The example calibration element determines, based on the location of the user's eyes relative to the display device and 3-D geometry, the ‘true’ angles of each illumination point from the user's eyes. The difference between each measured (‘actual’) angle of the user's eyes when observing the illumination point and the ‘true’ angle of the illumination point from the user's eye defines the ‘error’ factor for this particular user.
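The per-user ‘error’ factor described above can be sketched as follows. This is an illustrative sketch only; the function name, the angle representation (horizontal, vertical pairs in degrees), and the sample values are assumptions for illustration, not the calibration algorithm of any particular device.

```python
# Hypothetical sketch: per-target gaze 'error' factors during active
# calibration. Each angle is an (horizontal, vertical) pair in degrees.
def error_factors(true_angles, measured_angles):
    """For each calibration target, return the difference between the
    measured ('actual') gaze angle and the geometrically computed
    'true' angle of the target from the user's eye."""
    return [(m[0] - t[0], m[1] - t[1])
            for t, m in zip(true_angles, measured_angles)]

# Illustrative values for three of the nine targets:
true = [(-10.0, 5.0), (0.0, 5.0), (10.0, 5.0)]
meas = [(-9.2, 5.6), (0.4, 5.5), (10.9, 5.4)]
errs = error_factors(true, meas)
```

The set of such error pairs, one per illuminated target, is the input from which a correction function may subsequently be derived.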
Any of a variety of techniques may be used to determine a correction function that, when applied to the ‘actual’ angles, minimizes the error between the ‘corrected’ angle and the ‘true’ angle. In an example embodiment, a linear correction function comprising an offset and a scaling factor is determined that, when applied to the ‘actual’ angles, minimizes the sum of the squares of the errors. Non-linear correction functions may also be used, such as a second-order function that minimizes the errors.
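A least-squares fit of such a linear correction (offset plus scaling factor) can be sketched as below, treating one axis independently. The function name and sample values are hypothetical; this is a generic ordinary-least-squares regression, not the correction algorithm of any cited device.

```python
# Illustrative sketch: fit corrected = offset + scale * actual, per
# axis, by ordinary least squares over the calibration samples.
def fit_linear_correction(actual, true):
    n = len(actual)
    mean_a = sum(actual) / n
    mean_t = sum(true) / n
    # scale is the least-squares slope: cov(actual, true) / var(actual)
    scale = (sum((a - mean_a) * (t - mean_t) for a, t in zip(actual, true))
             / sum((a - mean_a) ** 2 for a in actual))
    offset = mean_t - scale * mean_a
    return offset, scale

# Fit from 'actual' gaze angles versus known 'true' target angles:
offset, scale = fit_linear_correction([-9.2, 0.4, 10.9],
                                      [-10.0, 0.0, 10.0])
corrected = offset + scale * 0.4   # correcting a new 'actual' reading
```

A second-order correction would be fitted analogously, with a quadratic term added to the model.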
A limitation of the use of a correction function is that it assumes a certain ‘uniformity’ in the errors that can be modeled with a single correction function. That is, for example, using a linear correction function assumes that the errors exhibit a linear variation across the display; a second order correction function assumes that the errors exhibit a pattern that corresponds to a second order (quadratic) effect.
Conventional correction functions are generally included within the gaze tracking software provided by the provider/developer, because the developer is aware of the algorithms used to determine the true angle and the measured angle, and is best able to determine and compensate for the errors introduced by the particular gaze tracking technique employed. Additionally, in the example of a gaze tracking device used to determine where, on a displayed image, the user is gazing, the angle-correction function must be applied before the coordinates of the gaze point on the display are determined.
Developers of applications that use the results provided by a gaze tracking device often desire to make their applications compatible with a variety of gaze tracking devices, to give their applications a broader market appeal. Preferably, these applications should provide the same functional utility regardless of the characteristics of the different gaze tracking devices. In like manner, the application should provide the same functional utility regardless of the size of the display being used (within practical limits).
If different gaze tracking devices exhibit different levels of accuracy or precision, the application developer must either design the application to be compatible with the least accurate device, which may limit the functionality of the application, or work with the provider of that device to enhance its inherent accuracy.
In addition to differing accuracy characteristics, different gaze tracking devices will also exhibit differences in other performance factors, such as ‘noise’ or ‘jitter’ caused by factors such as actual minute movements of the eye while gazing at a target, variances inherent in the algorithms used to compute the gaze angle, variances in the signals provided by the sensors used to detect the gaze angle, and so on. Depending upon the particular gaze tracking device, the device may provide ‘raw’ data corresponding to each determination of each coordinate on the display, or ‘smoothed’ data corresponding, for example, to a running average of the determined coordinates. A stream of raw data necessitates a smoothing algorithm in the application, whereas smoothed data introduces a lag in the responsiveness of the gaze tracking device.
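The trade-off described above between raw and smoothed data can be sketched with a simple running average; the class name and window size are illustrative assumptions, not the smoothing scheme of any particular device.

```python
# Sketch: smoothing raw gaze coordinates with a running average
# suppresses jitter, but lags behind an actual change in gaze point.
from collections import deque

class RunningAverage:
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # most recent raw samples

    def update(self, x, y):
        """Add a raw (x, y) gaze sample; return the smoothed point."""
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(p[0] for p in self.samples) / n,
                sum(p[1] for p in self.samples) / n)

smoother = RunningAverage(window=3)
smoother.update(100.0, 100.0)
smoother.update(100.0, 100.0)
# A sudden jump in gaze is only partially reflected at first (lag):
sx, sy = smoother.update(400.0, 100.0)   # sx is 200.0, not 400.0
```

An application receiving raw data would apply such a filter itself; one receiving pre-smoothed data inherits whatever lag the device's filter imposes.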