The present invention relates generally to determining the location of an object using a single camera and, more specifically, to determining the location of an object such as a finger or pen relative to a reflective surface such as a display screen.
A variety of interfaces are used to allow users of computing devices to enter and receive information. Traditionally, these interfaces have included a keyboard for entering alphanumeric characters and a pointing device, such as a mouse. The movement of the mouse is tracked by a pointer image on the computing device's screen. By moving the pointer with the mouse, the user is able to select objects on the screen, such as icons.
More recently, some computing devices have used so-called “touch-screen” pointing devices in place of, or in addition to, the mouse. A touch screen tracks the location of the user's finger or a stylus when it is placed in close proximity (less than a centimeter) to the display screen. These touch-screen devices are usually layered on top of the computing device's visual display. One type of touch screen measures the change in capacitance that results when the electrostatic field of the screen changes in response to the presence of the user's finger. Other types of touch-screen systems determine the location where the user touches the screen based on resistance, infrared grids, or piezoelectricity. It should be appreciated that these methods of determining where the user has touched the screen are typically integral to the device. Thus, it is difficult to add this functionality to existing computing devices without altering the visual appearance, and potentially the usability, of the computing device.
Other user-interface systems have been developed that utilize optical imaging to determine the location of the user's finger or a stylus relative to the screen. Some of these systems use a plurality of cameras in a fixed geometric relationship that acquire images of the user's finger from at least two different positions. These camera pairs are sometimes referred to as stereoscopic cameras. Due to the fixed relationship of the cameras, the position of the user's finger in the acquired images may be used to determine its location using trigonometric principles. Other systems use a single camera having components capable of using time-of-flight techniques to resolve the distance to an object. In these systems, the distance is determined based on the speed of light and the amount of time it takes for a laser or light pulse to travel to the object and return. It should be appreciated that both stereoscopic cameras and time-of-flight cameras are relatively specialized devices that need to be acquired by the user and may be costly.
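The two optical ranging principles referred to above can be sketched in a few lines of code. This is an illustrative simplification, not part of the specification: the triangulation form Z = f·B/d (focal length f in pixels, camera baseline B, pixel disparity d) and the time-of-flight form d = c·t/2 are the textbook versions of these techniques, and the numeric values in the example are hypothetical.

```python
# Illustrative sketch of the two optical ranging principles:
# stereoscopic triangulation and time-of-flight. Parameter names
# and example values are assumptions, not from the specification.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second


def stereo_depth(focal_length_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Depth of a point seen by a fixed stereoscopic camera pair.

    With a known baseline B between the cameras and focal length f,
    the disparity d between the point's image positions gives Z = f*B/d.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


def time_of_flight_distance(round_trip_s: float) -> float:
    """Distance from a light pulse's round-trip travel time: d = c*t/2."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0


# A 700 px focal length, 10 cm baseline, and 35 px disparity give 2 m depth.
print(stereo_depth(700.0, 0.10, 35.0))        # 2.0
# A 10 ns round trip corresponds to roughly 1.5 m.
print(time_of_flight_distance(10e-9))
```

Note that the stereo calculation needs only geometry and image measurements, whereas the time-of-flight calculation needs picosecond-scale timing hardware, which is why both approaches require specialized cameras.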
A third type of interface system has been developed based on acoustics. These devices detect the noise generated by the touching (scratching) of the screen and the direction from which the noise originated. In some instances, the noise sensors are attached to the screen and detect the propagation of sound waves in the screen substrate. Similar to the capacitance-type touch devices, the acoustic systems generally need to be integrated into the device to function as desired.