Interaction with technological devices is becoming an ever-increasing part of everyday life. However, the effectiveness and efficiency of such interaction are generally lacking. In particular, when seeking user input, devices such as computers, cellular telephones, and personal digital assistants (PDAs) are often disruptive, because such devices cannot assess the user's current interest or focus of attention. More efficient, user-friendly interaction is desirable in interactions with household appliances, electronic equipment, computers, and digital devices.
One way that human-device interactions can be improved is by employing user input such as voice and/or eye contact, movement, or position to allow users to control the device. Many previous attempts relate to controlling computer functions by tracking eye gaze direction. For example, U.S. Pat. No. 6,152,563 to Hutchinson et al. and U.S. Pat. No. 6,204,828 to Amir et al. teach systems for controlling a cursor on a computer screen based on user eye gaze direction. U.S. Pat. Nos. 4,836,670 and 4,973,490 to Hutchinson, U.S. Pat. No. 4,595,990 to Garwin et al., U.S. Pat. No. 6,437,758 to Nielsen et al., and U.S. Pat. No. 6,421,064 and U.S. Patent Application No. 2002/0105482 to Lemelson et al. relate to controlling information transfer, downloading, and scrolling on a computer based on the direction of a user's eye gaze relative to portions of the computer screen. U.S. Pat. No. 6,456,262 to Bell provides an electronic device with a microdisplay in which a displayed image may be selected by gazing upon it. U.S. Patent Application No. 2002/0141614 to Lin teaches enhancing the perceived video quality of the portion of a computer display corresponding to a user's gaze.
Use of eye and/or voice information for interaction with devices other than computers is less common. U.S. Pat. No. 6,282,553 teaches activation of a keypad for a security system using an eye tracker. Other systems employ detection of direct eye contact. For example, U.S. Pat. No. 4,169,663 to Murr describes an eye attention monitor which provides information relating simply to whether or not a user is looking at a target area, and U.S. Pat. No. 6,397,137 to Alpert et al. relates to a system for selecting the left or right side-view mirror of a vehicle for adjustment based on which mirror the operator is viewing. U.S. Pat. No. 6,393,136 to Amir et al. teaches an eye contact sensor for determining whether a user is looking at a target area, and using the determination of eye contact to control a device. The Amir et al. patent suggests that eye contact information can be used together with voice information to disambiguate voice commands when more than one voice-activated device is present.
While it is evident that considerable effort has been directed to improving user-initiated communications, little work has been done to improve device-initiated interactions or communications.