The human body is highly reconfigurable thanks to its many joints, which allow motion (generally rotation) in a variety of directions. Many of these motions are under fine, conscious motor control, so they can be used to interface with and control devices and systems ranging from musical instruments to microsurgery actuators.
The head is an extremity with several degrees of freedom of motion, and because of its close connection with the operator's visual system, tracking head position offers a wealth of information about the person. For example, head position provides a strong clue as to what the person is looking at. Alternatively, if head position can be tracked, the tracking information can be communicated to a mechanical system to control as many aspects of that system as there are degrees of freedom that can be tracked. A person's head position can therefore serve as a sensitive, multiple-degree-of-freedom input device, leaving the hands free for other activities. For example, head position can be used to steer a camera in a remote-viewing application so that the camera looks in the direction the person turns their head, while the hands manipulate a keyboard, joystick, or other controller. Head position and orientation estimates can be used in a number of other applications as well; these are discussed below.
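The remote-camera scenario above can be sketched in code. The mapping below is purely illustrative, not a method described in this text: it assumes a tracker that reports head yaw and pitch in degrees, and the function names, gains, and mechanical limits are all hypothetical choices.

```python
import math

# Hypothetical sketch: mapping tracked head orientation to a remote
# camera's pan/tilt commands. The tracker is assumed to report yaw and
# pitch in degrees; all constants below are illustrative, not measured.

PAN_LIMIT_DEG = 170.0    # assumed mechanical pan range of the camera
TILT_LIMIT_DEG = 45.0    # assumed mechanical tilt range
DEADZONE_DEG = 2.0       # ignore tiny head motions to suppress jitter

def _shape(angle_deg, limit_deg, gain=1.0):
    """Apply a deadzone and gain, then clamp to the camera's range."""
    if abs(angle_deg) < DEADZONE_DEG:
        return 0.0
    # Subtract the deadzone so the output rises smoothly from its edge.
    sign = 1.0 if angle_deg > 0 else -1.0
    shaped = gain * (abs(angle_deg) - DEADZONE_DEG) * sign
    return max(-limit_deg, min(limit_deg, shaped))

def map_head_to_pan_tilt(head_yaw_deg, head_pitch_deg):
    """Convert head yaw/pitch into pan/tilt commands for the camera."""
    pan = _shape(head_yaw_deg, PAN_LIMIT_DEG)
    tilt = _shape(head_pitch_deg, TILT_LIMIT_DEG)
    return pan, tilt
```

A deadzone of this kind is a common way to keep the camera still while the operator makes small involuntary head motions; the gain could be raised above 1.0 to let modest head turns sweep the camera's full range.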
Current head-tracking systems generally rely on direct measurements from accelerometers, gyroscopes, and other sensors mounted on the head, or on indirect measurements from one or more cameras observing carefully placed markers on the subject's head. These approaches are effective and accurate, but they depend on equipment and procedures that are expensive, cumbersome, or difficult to calibrate. An alternative approach that uses commodity equipment and automatic calibration to produce reasonably accurate head-position estimates could support many applications and may be of significant value in the field.
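The text does not specify how a commodity-camera estimate would be computed; one simple illustration, offered only as a sketch and not as the approach alluded to here, is to estimate head yaw from three 2-D facial landmarks under a weak-perspective, symmetric-face assumption. The landmark names and the anthropometric constant below are assumptions.

```python
import math

# Illustrative sketch only: a crude single-camera head-yaw estimate from
# three 2-D facial landmarks (left eye, right eye, nose tip), assuming a
# roughly symmetric face viewed under weak perspective. NOSE_DEPTH_RATIO
# is an assumed anthropometric constant (nose protrusion divided by
# interocular distance), not a calibrated value.

NOSE_DEPTH_RATIO = 0.6

def estimate_yaw_deg(left_eye, right_eye, nose):
    """Estimate head yaw in degrees from three image-plane landmarks.

    Model: with the face turned by yaw angle t, the projected eye
    separation shrinks as cos(t) while the nose tip shifts laterally
    from the eye midpoint as sin(t), so the offset-to-separation ratio
    is approximately NOSE_DEPTH_RATIO * tan(t).
    """
    lx, ly = left_eye
    rx, ry = right_eye
    nx, ny = nose
    # Unit vector along the eye line and the projected eye separation.
    ex, ey = rx - lx, ry - ly
    eye_dist = math.hypot(ex, ey)
    ux, uy = ex / eye_dist, ey / eye_dist
    # Lateral offset of the nose tip from the eye midpoint, measured
    # along the eye line (this makes the estimate tolerant of in-plane
    # head roll).
    mx, my = (lx + rx) / 2.0, (ly + ry) / 2.0
    offset = (nx - mx) * ux + (ny - my) * uy
    return math.degrees(math.atan((offset / eye_dist) / NOSE_DEPTH_RATIO))
```

For a frontal face the nose tip sits on the eye midline and the estimate is zero; as the head turns, the nose tip drifts toward one eye and the estimated yaw grows in that direction. A practical system would refine the anthropometric ratio per user, which is one place automatic calibration could enter.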