1. Field of the Invention
The present invention generally relates to a system for scrolling a screen and, more particularly, to a system for scrolling an image on a computer display screen based on eye movements.
2. Description of the Related Art
Eye tracking systems can determine where on a computer screen a user is looking, and they have been used as the primary input device of a computer, replacing both the keyboard and the mouse. Such systems can therefore control any computer process whose required response time is not too short to be achieved given the speed of eye movements and the constraints of the software. Because these systems require no force and no movement other than that of the eyes, they are appealing to people who cannot use conventional input devices.
While eye trackers can enable human-computer interaction using only the eyes, there are many reasons that such interaction can be difficult, frustrating, and tiresome. First, people do not normally control their eye movements consciously. Furthermore, people are not accustomed to changing their visual environment simply by looking at it. Another reason is that the accuracy of eye position information is limited by the angle of foveation, which is about 1 degree. This angle describes the portion of the visual field that falls entirely on the high-resolution part of the retina, called the fovea. Since everything projected onto the fovea is seen at high resolution, the eye need not move to evaluate foveated objects, so gaze position cannot be resolved more finely than this angle. Consequently, special techniques must be employed to improve the specificity of eye gaze information (such as enlarging an area that has been dwelled upon). Finally, eye gaze information is limited to cursor or screen position, which leads to very cumbersome styles of interaction.
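As an illustration only (not part of the disclosed system), the on-screen uncertainty implied by a roughly 1-degree foveation angle can be estimated from the viewing distance and display density. The viewing distance and DPI values below are assumed defaults chosen for the sketch:

```python
import math

def gaze_accuracy_px(view_distance_mm=600.0, fovea_deg=1.0, dpi=96.0):
    """Estimate the on-screen diameter, in pixels, of the region within
    which gaze position cannot be further resolved.

    view_distance_mm -- assumed eye-to-screen distance (~60 cm is typical)
    fovea_deg        -- full foveation angle, about 1 degree
    dpi              -- assumed display density in dots per inch
    """
    # Width on screen subtended by the foveation angle at this distance.
    spread_mm = 2.0 * view_distance_mm * math.tan(math.radians(fovea_deg / 2.0))
    # Convert millimeters to pixels (25.4 mm per inch).
    return spread_mm * dpi / 25.4
```

Under these assumptions, the uncertainty is on the order of 40 pixels, which is why a gaze point alone cannot reliably select a small on-screen target such as a scroll-bar tab.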
In contrast, most input devices are able to convey both an action (e.g., a left or right mouse click) and the object of that action (e.g., the graphical construct underneath the mouse cursor). To distinguish the action from its object, eye trackers either use a secondary input, such as a sip-and-puff switch, or they distinguish between eye movement and eye dwell. Moving the eye to a particular screen location and dwelling there indicates both the object and the desire to perform an action. This can produce a dialog window with buttons indicating the available actions, which can then be selected by looking at them and dwelling upon them. However, this results in an unnatural, slow, and cumbersome means of controlling the computer.
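The dwell-based selection described above can be sketched as follows. This is a minimal illustration, not the disclosed system; the dwell threshold and dwell radius are assumed values, and the `DwellSelector` class and its method names are hypothetical:

```python
import math

class DwellSelector:
    """Emit a selection when gaze remains within a small radius of one
    point for at least `dwell_ms` milliseconds."""

    def __init__(self, dwell_ms=800, radius_px=40):
        self.dwell_ms = dwell_ms      # assumed dwell threshold
        self.radius_px = radius_px    # assumed gaze-jitter tolerance
        self.anchor = None            # (x, y, start_time_ms) of current dwell

    def update(self, x, y, t_ms):
        """Feed one gaze sample; return the (x, y) of a completed dwell
        selection, or None if no selection has occurred yet."""
        if self.anchor is None:
            self.anchor = (x, y, t_ms)
            return None
        ax, ay, at = self.anchor
        if math.hypot(x - ax, y - ay) > self.radius_px:
            # Gaze moved away: this is eye movement, not dwell; restart.
            self.anchor = (x, y, t_ms)
            return None
        if t_ms - at >= self.dwell_ms:
            # Dwell completed: fire one selection, then reset.
            self.anchor = None
            return (ax, ay)
        return None
```

For example, three samples near pixel (100, 100) spanning 900 ms would trigger a selection at that point, whereas a saccade away before the threshold restarts the timer, distinguishing deliberate dwell from ordinary eye movement.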
Reading text and navigating through information on computer screens can be difficult even when using conventional input devices and interaction idioms, like dragging the tab on a window scroll bar. This is made even more difficult when the user is unable to use conventional devices and must use alternatives, like eye tracking systems.
The challenge, therefore, is to make reading text and navigating through scrollable information easier and more natural. Scrollable information includes any information that is discretized into individual units and can be displayed sequentially. Examples of navigating through scrollable information include reading text, searching through thumbnail images, perusing the columns or rows of a spreadsheet, etc.
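The notion of scrollable information as discrete, sequentially displayable units can be sketched as a sliding window over a list. This is an illustrative model only; the function names, window size, and clamping behavior are assumptions rather than part of the disclosure:

```python
def visible_units(units, offset, window):
    """Return the `window` consecutive units displayed starting at `offset`.
    Units may be text lines, thumbnail images, spreadsheet rows, etc."""
    return units[offset:offset + window]

def scroll(offset, delta, total, window):
    """Advance the display window by `delta` units (negative scrolls back),
    clamped so the window never runs past either end of the sequence."""
    return max(0, min(offset + delta, max(0, total - window)))
```

Scrolling forward through ten text lines with a three-line window, for instance, advances the offset until the last three lines are shown and then stops, regardless of further scroll requests.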