Touchscreen computer interfaces are now present in virtually every setting, public and private. Found in devices as diverse as hand-held computers and informational kiosks, touchscreens help everyone from scientists to tourists input and extract electronically stored information. Touchscreens allow a computer user's finger or stylus to act as an input device, making them extremely useful in applications where a mouse or keyboard would be impractical or impossible.
Just as the general public intuitively grasps the operation of touchscreens from a user's standpoint, those of skill in the art will similarly recognize the basic operational technology underlying touchscreen interfaces, and how touchscreen device drivers communicate with operating systems. Known touchscreens use various physical or electrical attributes of the screen to sense inputs. Among these are resistance, capacitance, temperature, pressure, vibration, and motion. Programs to implement a graphical user interface (GUI) and operating system are provided in the central processing unit of the computer device with which the touchscreen is associated. These programs typically include a device driver that allows the user to perform such functions as selecting and opening files, moving icons, and inputting commands and other data through the display screen of the computer itself. The allowed inputs are usually similar to those that can be accomplished with a mouse or other standard input device.
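The driver's role described above can be sketched in a few lines. The following is a hypothetical illustration, not an implementation from any patent or operating system: it shows how a driver might map raw touch samples from the sensing layer onto the same pointer events an operating system expects from a mouse. All names and the pressure threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class RawTouch:
    x: int          # screen coordinates reported by the sensing layer
    y: int
    pressure: float # e.g., from a resistive or pressure-sensitive panel

@dataclass
class PointerEvent:
    x: int
    y: int
    kind: str       # "down", "move", or "up", mirroring mouse button events

def translate(touch: RawTouch, was_touching: bool) -> PointerEvent:
    """Convert one raw touch sample into a mouse-like pointer event."""
    if touch.pressure > 0.0:
        # Contact present: a new contact acts like a button press,
        # a continuing contact acts like a drag.
        kind = "move" if was_touching else "down"
    else:
        # Contact lifted: acts like a button release.
        kind = "up"
    return PointerEvent(touch.x, touch.y, kind)
```

In such a scheme, selecting and opening files, moving icons, and similar operations reduce to the same down/move/up event stream a mouse driver produces, which is why the allowed inputs resemble those of a standard input device.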
Although touchscreen inputs and standard input device inputs are similar in some respects, there are significant practical differences. For example, in systems using a conventional input device such as a mouse, there is typically only one port or conduit through which the expected kind of input may be received. Since there is usually only one mouse associated with a PC, all point-and-click input comes from the mouse, and there is little to no chance of confusing mouse input with input from other sources.
Unfortunately, such confusion is common in touchscreen applications. Known touchscreen devices have no way of discriminating in time and space between palm and finger, or stylus and thumb, as they touch the screen in combination. This is because, in typical touchscreen applications, it is impossible to effectively distinguish between multiple, simultaneous activation points on the touchscreen. Thus it often occurs, especially in large-format touchscreens, that unintended inputs result from, for example, a palm resting on the touchscreen surface in conjunction with a stylus applied to the touchscreen.
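The failure mode described above can be illustrated numerically. The sketch below is a simplified, hypothetical model (not drawn from any cited reference): a single-point sensing panel that can report only one position effectively collapses two simultaneous contacts, such as a resting palm and a stylus tip, into a single spurious point between them.

```python
def reported_point(contacts):
    """Model a single-point panel that averages all simultaneous contacts
    into one reported position (a common simplification of resistive
    panel behavior)."""
    xs = [c[0] for c in contacts]
    ys = [c[1] for c in contacts]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

stylus = (300, 120)   # the intended input point
palm = (100, 400)     # a palm resting near the bottom of the screen

print(reported_point([stylus]))        # stylus alone: position as intended
print(reported_point([palm, stylus]))  # palm + stylus: a point neither touched
```

With the stylus alone, the panel reports the intended coordinates; with the palm also in contact, it reports a location the user never touched, producing exactly the kind of unintended input described above.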
There have been attempts in the art to distinguish, in limited ways, between various types of screen touches. One such attempt may be seen in U.S. Pat. No. 5,764,222 to Shieh, which describes a "method, apparatus, and article of manufacture" creating a "virtual pointing device" on a touchscreen. This patent suggests using the touchscreen to measure various dimensions of a user's hand, then recording these measurements. The computer can then be programmed to assign a function to a touch from each of the respective hand portions of a measured user. The computer keeps a database of each user's unique measurements, as well as a "default" file representing a "generic" set of measurements.
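The measurement-database scheme attributed above to the Shieh patent might be sketched as follows. This is a hedged illustration only; all names, structures, and threshold values are assumptions for exposition and are not taken from the patent itself.

```python
# Contact widths (in mm) for each hand portion; the "default" profile
# stands in for the patent's "generic" set of measurements.
DEFAULT_PROFILE = {"fingertip": 8.0, "thumb": 14.0, "palm": 40.0}

profiles = {"default": DEFAULT_PROFILE}

def register_user(name, measurements):
    """Store a user's measured hand-portion dimensions in the database."""
    profiles[name] = measurements

def classify_touch(user, contact_width_mm):
    """Match a touch to the nearest measured hand portion for this user,
    falling back to the generic profile for unknown users."""
    profile = profiles.get(user, profiles["default"])
    return min(profile, key=lambda part: abs(profile[part] - contact_width_mm))
```

A system along these lines can assign a distinct function to a fingertip touch versus a palm touch for a known user, but, as noted below, it offers no mechanism for resolving two such touches arriving at the same instant.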
Although this patent suggests a way to distinguish between and among some types of screen touches, it does not address the problem of classifying or characterizing these inputs when they are made substantially simultaneously. Nor is the described system practical for applications used by a wide cross-section of individuals in public places, such as kiosks or ATMs. More importantly, there is no provision made for those instances in which two fingertips, or a stylus and a knuckle, of the user hit the screen at about the same time. Thus, the computer may respond in a way that the user did not intend, or fail to respond at all.
It is apparent from the foregoing that the need exists for a simple and efficient touchscreen operational arrangement that will facilitate touchscreen use by effectively distinguishing between multiple, simultaneous activation points on the touchscreen.