Eye tracking systems have been developed to interact with electronic devices such as personal computers, automobile control systems, and other automation systems. In general, such systems allow a user to issue commands to an electronic device by gazing at a predetermined area of a monitor or display. Eye tracking systems capable of activating and operating electronic devices reduce the amount of verbal communication or gesturing required to convey a desired message or command to the selected electronic device.
Initially, such eye tracking systems found use as interfaces to devices operated or used by handicapped or otherwise impaired persons, although the technology has gradually spread to other applications, where it still offers some limited command capabilities.
Recently, in addition to eye tracking systems, interfaces capable of recognizing hand gestures have become available, for example the Microsoft Kinect system and others. Methods and systems combining different types of non-contact man-machine interfaces are developing rapidly and are disclosed, for example, in the EyeMax+ from DynaVox Mayer-Johnson, which combines speech and eye inputs to allow disabled persons to control a computer.
Such non-contact man-machine interfaces could be desirable in the modern operating room environment, where relatively large teams of medical specialists of different professions perform a surgical or diagnostic procedure using a relatively large number of instruments or machines. These interfaces could reduce the amount of verbal communication, which is prone to misinterpretation, and their command protocol could be set according to predefined settings based either on team hierarchy or on procedure sequence. Such interfaces could replace the existing operating buttons, keyboards, or touch-type monitors used by some operating team members to perform different procedures and protocols. Reducing contact-operated processes and verbal communication could help preserve sterility in the operating room.