(i) Technical Field of Invention
The invention is a form of oculometer and as such can be used to measure eye gaze and fixation duration as well as the dual eye binocular convergence point; it therefore has potential applications in the medical, scientific, engineering, manufacturing, military, and entertainment fields. The apparatus may be used as a tool for medical diagnosis of ocular functions; as an aid to the paraplegic or handicapped; for the measurement of ocular functions and workload in human factors studies; as a measure of subject training; as a tool for fatigue monitoring; as part of an electronic safety net to detect performance degradation due to pilot incapacitation in piloted and teleoperated vehicles; as a component of an electronic intelligent pilot-vehicle interface used for situation awareness aiding in piloted and teleoperated vehicles; for task scan analysis, including measuring situation awareness; for human operator control of machines and computer games; and for advertisement and usability analysis.
The invention is used with head-mounted video displays such as those developed for virtual reality, stereographic displays, monocular or binocular vision helmet mounted displays, and night vision goggles used in piloted helicopters, vehicles, and teleoperated robotics control stations. The invention can be used as an eyetracker to control computerized machines from an electronic video display by the ocular gaze point of regard and fixation duration.
Examples of machine control by ocular functions include updating computer generated information displays, selecting panel switches and instruments, controlling the fidelity of computer generated imagery scene inserts in simulations, controlling the viewing direction of remotely located cameras, controlling the movement of teleoperated robotics platforms or vehicles, selecting display subareas for automated scene analysis in aided target recognition, designating targets from direct sight or from a sensor display, and weapon system pointing.
The invention has particular applications to time shared concurrent tasks where the hands are involved in a continual time critical pilotage task and the eyes may be used intermittently to control a discrete task. The use of this invention enables both tasks to share a common visual working area with overlaid visual images. In this way, task interference is reduced by dedicating eye-movements and visual attention to the same working surface.
An example of such an application would be single pilot nap-of-the-earth low-level helicopter flight while updating onboard heads-up displays. A similar application is teleoperation of remote vehicles from video displays with camera control. Another such application is the operation of completely enclosed armored vehicles with "transparent" or "see through" armor, where the operator sees a video projection of the outside scene as recorded by externally mounted cameras and relayed to internal monitors; the operator would use the invention to control displays overlaid on the scene projection while concurrently performing the vehicle pilotage task. Similar comments apply to the piloting of "glass cockpit" designs for completely enclosed high performance aircraft.
(ii) Description of the Prior Art
The existing technology for oculometers is based upon the optical measurement of reflected infrared light from the human eye, and is more than thirty years old in concept. In its present form, an oculometer contains a single infrared light source which is directed at the eye; the reflected light is imaged onto a charge-injection device (CID) or charge-coupled device (CCD) sensor array. The image of the eye is then electronically processed to determine the corneal reflection, the pupil centroid orientation, or both. These parameters are used to determine the angular location of the eye relative to the camera within a fair degree of accuracy. The technology is either head-mounted or mounted in a panel in front of the user.
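The pupil-center/corneal-reflection principle described above can be illustrated with a minimal sketch. This is not the method of any particular cited patent: the function name, coordinates, and calibration coefficients are hypothetical, and it assumes the common simplification that the pupil-glint difference vector maps approximately linearly to gaze position after a calibration in which the user fixates known targets.

```python
# Illustrative sketch (assumed, not from any cited patent): the
# pupil-center / corneal-reflection technique.  The difference vector
# between the pupil centroid and the corneal glint varies roughly
# linearly with gaze angle over a modest range; a per-axis linear
# calibration maps it to display coordinates.

def pccr_gaze(pupil_center, glint_center, calib):
    """Map a pupil-glint difference vector to an estimated gaze point.

    pupil_center, glint_center: (x, y) image coordinates in pixels.
    calib: per-axis (gain, offset) pairs, fit during a calibration
           procedure in which the user fixates known targets.
    """
    dx = pupil_center[0] - glint_center[0]   # horizontal pupil-glint offset
    dy = pupil_center[1] - glint_center[1]   # vertical pupil-glint offset
    (gx, ox), (gy, oy) = calib
    return (gx * dx + ox, gy * dy + oy)

# Example with made-up calibration coefficients:
gaze = pccr_gaze((312.0, 240.5), (300.0, 236.0),
                 ((25.0, 512.0), (25.0, 384.0)))
```

In practice the mapping is nonlinear toward the periphery, which is one reason the technology's accuracy is limited to roughly a degree.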
The head mounted systems are awkward to wear. The accompanying optics are bulky and heavy, limited in field of view, and induce neck and shoulder muscle fatigue. The infrared light source is placed next to the eye; a filter shields the source, eye, and sensor from the ambient light. This arrangement limits and interferes with the visual field. The boresight, established in calibration, is easily perturbed by head movement or by vibration-induced disturbances in the head support system. In addition, adjustment difficulties may occur with individuals having a large nose bridge, deep-lying eyes, bulging eyes, jutting eyelids, or other extreme deviations in facial structure.
These limitations become apparent when integrating the present technology with helmet mounted displays. Recent advances in display technology have produced extremely lightweight helmet mounted displays comparable in size to bulky sunglasses. The eye relief distance is specified in millimeters, and there is little room between the display and the eye for placing the infrared sources and fiber optic probes used with the present technology.
The optics for the panel mounted system are mounted in front of the user and directed toward his face. The panel mounted system is limited to low ambient light levels, and objects that the user may need to work with cannot be placed between his face and the optics. A servomechanism is used to keep the optics aligned on the user's eye, and its adjustment following head movement is noticeable to the user. Excessive head movements, and interference with the optical path by facial features such as the user's nose, are not tolerated.
The oculometer determines the angular location of the eye relative to the sensor array within a fair degree of accuracy. The measurement of head position and orientation for the head-mounted system allows the determination of eye position and orientation in the workspace, and therefore the computation of the eye point of regard. Similarly, determination of the range from the sensor to the eye, by either ultrasonic ranging or image focusing, allows the computation of the eye point of regard for the panel system. The accuracy of the technology is roughly +/- one degree in practice; the lower limit appears to be about +/- 0.75 degree.
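The point-of-regard computation described above reduces to a standard geometric step once eye position and gaze direction are known in workspace coordinates: cast a ray from the eye along the gaze direction and intersect it with a surface of interest, such as a display plane. The following is a minimal sketch of that step under assumed names and a planar workspace; it is not taken from the prior-art systems themselves.

```python
# Illustrative sketch (assumed): computing the eye point of regard as
# the intersection of the gaze ray with a workspace plane (e.g. a
# display surface).  All vectors are 3-tuples in workspace coordinates.

def point_of_regard(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray with a plane; return the 3-D point or None."""
    denom = sum(d * n for d, n in zip(gaze_dir, plane_normal))
    if abs(denom) < 1e-9:        # gaze parallel to the plane: no fixation point
        return None
    t = sum((p - e) * n
            for p, e, n in zip(plane_point, eye_pos, plane_normal)) / denom
    if t < 0:                    # plane lies behind the eye
        return None
    return tuple(e + t * d for e, d in zip(eye_pos, gaze_dir))

# Eye at the origin looking roughly down +z toward a display at z = 0.5 m:
por = point_of_regard((0.0, 0.0, 0.0), (0.1, 0.0, 1.0),
                      (0.0, 0.0, 0.5), (0.0, 0.0, 1.0))
```

An angular error of +/- one degree at a 0.5 m viewing distance corresponds to roughly +/- 9 mm of positional error on the display, which bounds how small a selectable item can usefully be.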
The oculometer is used to determine the location of the user's visual attention, and it can be used to select display icons in machine control. The user may be looking over a display or scene to acquire task-related information in preparation for initiating machine control. Some other means is usually required for the user to indicate that a particular display element or scene feature has been selected for machine action. For example, the user may perform a motor function in conjunction with a forced gaze at a selected item, by pressing a switch or speaking a voice command to an automatic speech recognizer.
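The gaze-plus-confirmation scheme described above can be sketched as follows. This is a hypothetical illustration, not the logic of any cited system: the dwell threshold, sample rate, and function name are assumptions. A display item counts as selected only if the gaze has dwelt on it for a minimum duration at the moment the user issues the consent event (switch press or voice command).

```python
# Hypothetical sketch of fixation-plus-consent selection: gaze alone
# never triggers machine action; a consent event (switch or voice
# command) confirms whichever item the gaze has dwelt on long enough.

def select_item(gaze_samples, consent_index,
                sample_period=0.02, dwell_threshold=0.5):
    """Return the selected item id, or None if no valid selection.

    gaze_samples: item id (or None) hit by gaze at each fixed-rate sample.
    consent_index: sample index at which the consent event occurred.
    sample_period: seconds between gaze samples (assumed 50 Hz here).
    dwell_threshold: minimum steady gaze duration, in seconds.
    """
    item = gaze_samples[consent_index]
    if item is None:                       # gaze was off all items at consent
        return None
    # Count backwards: how long has the gaze stayed on this same item?
    n, i = 0, consent_index
    while i >= 0 and gaze_samples[i] == item:
        n += 1
        i -= 1
    return item if n * sample_period >= dwell_threshold else None
```

Requiring the explicit consent event avoids the "Midas touch" problem, in which every glance at a control would otherwise trigger it.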
Exemplary of the existing technology are the following patents and publications.
U.S. Pat. No. 3,986,030 to Teltscher discloses an eye-motion operable keyboard accessory that includes light source means for directing a light ray at the eye of an operator, for obtaining a light reflection therefrom, and a plurality of light-responsive sensors selectively actuable by the operator-reflected light ray.
U.S. Pat. No. 4,109,145 to Graf discloses an apparatus controlled by movement of the eye which, using a line-of-sight detecting apparatus such as an oculometer, produces signals indicative of the direction along which an operator is looking and positions a display for viewing by the operator.
U.S. Pat. No. 4,209,255 to Heynau et al. discloses a single-source aiming point locator using a single light emitting diode (LED) mounted on the helmet of a pilot in an aircraft, which can be used to designate a point on a display from photodiodes located adjacent to the display.
U.S. Pat. No. 4,623,230 to Weinblatt discloses a media survey apparatus and method using thermal imagery to monitor visual observation of a transitionally viewed display, by directing infrared radiation from behind the display into the path of travel of a moving observer and detecting the retinal eye reflections from the observer when he is viewing the display.
U.S. Pat. No. 4,648,052 to Friedman et al. discloses an eye-tracker communication system based on a system for computer vision comprising a source of digitized video signals and a frame encoder circuit which encodes threshold crossing events in eye fixations on a two-dimensional area within the frame and stores the results in a cache memory.
U.S. Pat. No. 4,761,071 to Baron discloses an apparatus and method for determining the corneal and scleral topography by a plurality of triangulations, each made to determine the elevation of a different location on the surface of the eye from a position sensitive detector of known orientation and an incident narrow beam of light of known position and direction, which illuminates a fluorescent substance instilled on the surface of the eye.
U.S. Pat. No. 4,973,149 to Hutchinson discloses an eye movement detector utilizing an infrared light emitting diode mounted coaxially in front of the lens of an infrared-sensitive video camera for remotely making images of the eye of a computer operator.
U.S. Pat. No. 5,035,500 to Rorabaugh et al. discloses an apparatus for automated ocular perimetry, particularly kinetic perimetry, which, using a single visual light source moved with an x-y plotter under computer control to discrete positions within the visual field of a subject, has the subject indicate detection of other light sources positioned in a regular array and momentarily illuminated at various times while the subject is fixating the first light source.
U.S. Pat. No. 5,170,153 to Migozzi et al. discloses an optical device for the display of light data collimated to infinity which uses an oculometer to determine the pupil center of the eye and an automatic control circuit to adjust the position of the display image so that it is centered over the pupil.
U.S. Pat. No. 5,220,361 to Lehmer et al. discloses gaze tracking for a field analyzer through automated video surveillance of the patient's eye from an infrared source mounted in the center of the test screen to the front of the patient, with video measurement of the corneal reflected light and computation of the pupil chord to determine the gaze direction.
Adams, C. (1990). "If looks could kill: the eyes have it". Military & Aerospace Electronics, (Mar), 35-37. Reviews infrared light source oculometers and their potential use by pilots of military aircraft in designating targets by eye gaze.
Anliker, J. (1976). "Eye movements: on-line measurements, analysis, and control". In R. A. Monty & J. W. Senders (Eds.), Eye Movements and Psychological Processes (pp. 185-199). Hillsdale, N.J.: Lawrence Erlbaum Associates. Describes automatic fixation and saccade detector classification schemes.
Calhoun, G., Janson, W., & Arbak, C. (1986). "Use of eye control to select switches". Proceedings of the Human Factors Society 30th Annual Meeting, Dayton, Ohio. Describes the experimental results and limitations of pilots using a head-mounted infrared light source oculometer for selection of panel switches in a fighter aircraft simulator.
Goode, P. W. (1984). "Eye Directed View". 21st Space Conference, Kennedy Space Center. Describes experimental results of using an infrared light source oculometer for control of remote camera viewing direction.
Jacob, R. J. K. (1989). "What you see is what you get: the use of eye movements in human-computer interaction techniques". Washington, D.C.: Naval Research Laboratory. Describes techniques and limitations of using a panel-mounted infrared light source oculometer for control of computer-driven displays by eye fixation and gaze duration.
Smyth, C. C. & Dominessy, M. E. (1989). Comparison of oculometer and head-fixed reticle with voice or switch and touch panel for data entry on a generic tactical air combat display (Technical Memorandum 21-89). Aberdeen Proving Ground, Md.: U.S. Army Human Engineering Laboratory. Describes the experimental results and limitations of using a head-mounted infrared light source oculometer for control of computer-driven displays by eye fixation and switch or voice commands.