A touchscreen is a specially configured display device that is generally capable of detecting when a display panel is touched and the location of touches within the display panel. The touches may be provided by a stylus, human finger, or the like. A touchscreen provides an additional or alternative input device to a keypad, keyboard, or mouse for a wide range of electronic devices, such as point of sale devices, hand-held electronics, interactive displays, work stations, personal computers, automated teller machines, and the like. Historically, these touchscreen systems have employed various technologies to sense the location of touches, but have been limited to detecting the location of only a single touch at any given time.
In recent years, multi-touch touchscreens have been developed that are capable of sensing multiple touches at the same time. The ability for a touchscreen to detect and track multiple touches represents a revolutionary step forward in interface technology. One can readily imagine the freedom such technology can impart to specialty users, such as graphics designers, presenters, photographers, and architects, as well as general computer users. With multi-touch touchscreens, one or more users can use multiple fingers to select, manipulate, and drag displayed objects at the same time. The potential for such applications is vast, especially as the size of the multi-touch touchscreens increases.
The most commercially notable multi-touch touchscreen to date is the primary interface and display on the Apple® iPhone. The iPhone allows a user to use two fingers at the same time to perform various functions through the touchscreen, including enlarging or shrinking an object being displayed by respectively sliding the fingers apart or sliding the fingers toward each other along the surface of the touchscreen. The iPhone touchscreen relies on a capacitive grid to detect the coordinates and movement of the different touches; however, the use of the capacitive grid means that the touchscreen cannot detect the touch of a stylus or human touches through non-conductive gloves. Further, the use of the capacitive grid substantially limits the scalability of the touchscreen and the number of touches that can be tracked at any given time. For the iPhone, only two touches are generally used at any given time. As such, only relatively small displays are able to incorporate the iPhone's touchscreen technology. Many other current touchscreen technologies suffer from the same operational and size limitations.
Promising new multi-touch technologies are being developed for large display panels by corporations such as Perceptive Pixel, Inc. (www.perceptivepixel.com) and Microsoft Corporation (www.microsoft.com/surface). These technologies employ rear projection technology to project display content on the back of a large display panel such that the display content is viewable from the front of the display panel. The touch detection relies on sensing infrared (IR) light being emanated or reflected from the back of the display panel in response to the front of the display panel being touched. The IR light is generally only emanated from the spots being touched, and IR sensors are capable of detecting the intensity, location, and any movement associated with the touches. An associated processor receives the corresponding touch information from the IR sensors and processes the touch information as user input. The processor is configured to control the display content based on the user input derived from the display panel being touched.
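The sensing pipeline described above — IR sensors reporting the intensity and location of bright spots where the panel is touched, with a processor reducing those readings to touch coordinates — can be illustrated with a minimal sketch. This is not the implementation used by any of the systems named here; it assumes a hypothetical normalized 2-D IR intensity frame and a hypothetical detection threshold, and simply locates above-threshold blobs and reports their intensity-weighted centroids:

```python
import numpy as np

def detect_touches(frame, threshold=0.5):
    """Locate bright IR blobs in a normalized sensor frame.

    frame: 2-D array of IR intensities in [0, 1] (hypothetical input).
    threshold: illustrative cutoff separating touch light from background.
    Returns a list of (row, col, peak_intensity), one entry per touch.
    """
    mask = frame > threshold
    visited = np.zeros_like(mask, dtype=bool)
    touches = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Flood-fill the connected blob of above-threshold pixels.
                stack = [(r, c)]
                pixels = []
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys = np.array([p[0] for p in pixels])
                xs = np.array([p[1] for p in pixels])
                weights = frame[ys, xs]
                # Intensity-weighted centroid gives the touch location.
                cy = float(np.average(ys, weights=weights))
                cx = float(np.average(xs, weights=weights))
                touches.append((cy, cx, float(weights.max())))
    return touches
```

A downstream processor would compare touch lists across successive frames to derive movement, and treat the resulting coordinates and intensities as user input.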
Although these technologies provide remarkable multi-touch interactivity, the technology is essentially limited to human touch and is generally not capable of tracking contact and movement of smaller, non-human objects, such as a stylus from a pen-type instrument or the like. The effective touch sensitivity of the display panel and the IR sensors is relatively low, thereby limiting the resolution at which a user may operate to roughly the size of a fingertip. Accordingly, multi-touch applications are limited to relatively high level selection, movement, and outlining functions. Small-scale handwriting, drawing, selection, and the like that would require operating at higher resolutions are not currently feasible. Further, when multiple users are interacting with the display panel, these technologies are not capable of differentiating between different users who are touching the display panel.
Yet a further drawback for these technologies is the impact of ambient light on detecting IR light that is emanated or reflected from the rear of the display panel. Ambient light is often incandescent or natural light, which has a relatively high IR light content. The ambient IR light interferes with the IR light that is emanated or reflected from the rear of the display panel in response to a touch, and effectively reduces the ability of the IR sensors to detect when and how the display panel is being touched. For example, the ability to determine the relative intensity or force associated with a touch generally degrades as the amount of ambient IR light increases, and is especially difficult when the ambient light is changing.
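One common way to reduce the impact of ambient IR light, sketched below for illustration only, is to maintain a slowly updated estimate of the ambient IR frame and subtract it from each sensor frame before detecting touches. The function name, the update rate `alpha`, and the "no touch" margin are all hypothetical values, not parameters of any described system:

```python
import numpy as np

def compensate_ambient(frame, ambient, alpha=0.05):
    """Subtract a running ambient-IR estimate from a sensor frame.

    frame, ambient: 2-D arrays of IR intensities (hypothetical inputs).
    alpha: update rate for the ambient model; kept small so brief
           touches do not bleed into the background estimate.
    Returns (touch_signal, updated_ambient).
    """
    # Touch energy is whatever exceeds the current ambient estimate.
    touch_signal = np.clip(frame - ambient, 0.0, None)
    # Update the ambient model only at pixels with no apparent touch,
    # so touch light is not absorbed into the background over time.
    quiet = touch_signal < 0.1  # hypothetical "no touch" margin
    updated = np.where(quiet,
                       (1 - alpha) * ambient + alpha * frame,
                       ambient)
    return touch_signal, updated
```

Because the ambient estimate tracks slow changes in incandescent or natural light, the residual touch signal stays comparable across lighting conditions, which makes relative intensity or force easier to judge even as the ambient light drifts.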
Accordingly, there is a need for a large-scale multi-touch touchscreen system that is capable of detecting touches and movement associated therewith from styluses and other non-human objects. There is a further need for a multi-touch touchscreen system that is capable of differentiating touches from different users. There is still a further need for a multi-touch touchscreen system that is capable of operating in a more efficient manner in incandescent and natural light environments.