Many types of input devices are presently available for performing operations in a computing system, such as buttons or keys, mice, trackballs, touch panels, joysticks, touch screens and the like. Touch screens, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch screens can include a touch panel, which can be a clear panel with a touch-sensitive surface. The touch panel can be positioned in front of or integral with a display screen so that the touch-sensitive surface covers the viewable area of the display screen. Touch screens can allow a user to make selections and move a cursor by simply touching the display screen via a finger or stylus. In general, the touch screen can recognize the touch and position of the touch on the display screen, and the computing system can interpret the touch and thereafter perform an action based on the touch event.
Touch panels can include an array of touch sensors capable of detecting touch events (the touching of fingers or other objects upon a touch-sensitive surface). Some touch panels can detect multiple touches (the touching of fingers or other objects upon a touch-sensitive surface at distinct locations at about the same time) and near touches (fingers or other objects within the near-field detection capabilities of their touch sensors), and identify and track the locations of the touches. Examples of multi-touch panels are described in Applicant's co-pending U.S. application Ser. No. 10/842,862 entitled “Multipoint Touchscreen,” filed on May 6, 2004 and published as U.S. Published Application No. 2006/0097991 on May 11, 2006, the contents of which are incorporated by reference herein.
As mentioned above, a display screen can be located beneath the sensor panel. A user interface (UI) algorithm can generate a virtual keypad or other virtual input interface on the display screen beneath the sensor panel, which can include virtual buttons, pull-down menus and the like. By detecting touch events at locations defined by the virtual buttons, the UI algorithm can determine that a virtual button has been “pushed.” The magnitude of the analog channel output values, which indicates the “degree” of touch, can be used by the UI algorithm to determine whether the touch was sufficient to trigger the pushing of the virtual button.
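The virtual-button logic described above can be sketched as follows. This is a minimal illustration only: the function and parameter names, the rectangle representation and the threshold value are hypothetical and not taken from the source.

```python
# Hypothetical sketch of virtual-button triggering from analog channel
# output values. The threshold below is illustrative; a real UI algorithm
# would derive it from the panel's characteristics.

TOUCH_THRESHOLD = 50  # minimum "degree" of touch needed to trigger a button


def button_pushed(button_rect, touch_x, touch_y, output_value):
    """Return True if a touch at (touch_x, touch_y) with the given analog
    channel output value triggers the virtual button covering button_rect,
    where button_rect is (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = button_rect
    inside = x0 <= touch_x <= x1 and y0 <= touch_y <= y1
    return inside and output_value >= TOUCH_THRESHOLD
```

In this sketch, a touch only counts as a “push” when it both falls within the button's area and produces an output value at or above the threshold; a light touch inside the button, or a firm touch outside it, is ignored.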
Ideally, a given amount of touch should generate an analog channel output value of the same magnitude regardless of where the touch event occurs on the sensor panel. In practice, however, the electrical characteristics of the sensors in a sensor panel are likely to vary due to processing variations, manufacturing tolerances, assembly differences (which can depend on the location of the sensors relative to the edges and shape of the sensor panel), aging, stress, dirt, moisture, deformation due to pressure, temperature, expansion of materials and geometries, and the like. As a result, the magnitude of the analog channel output values can vary from location to location within the sensor panel. This can lead to inconsistent, false or missed triggering of virtual buttons, and a frustrating user experience as the user discovers that certain areas of the sensor panel require more or less touch to trigger a virtual button.
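One common way to compensate for such location-dependent variation is to scale each sensor's raw output by a per-sensor calibration gain, so that equal touches yield roughly equal values anywhere on the panel. The sketch below assumes hypothetical gains; the source does not specify this scheme, and in practice the gains would be measured, for example by applying a known reference touch at each sensor location.

```python
# Illustrative per-sensor normalization of analog channel output values.
# The gains are hypothetical calibration factors, one per sensor, chosen
# so that a reference touch produces the same normalized value everywhere.

def normalize(raw_values, gains):
    """Scale each sensor's raw analog output by its calibration gain."""
    return [raw * gain for raw, gain in zip(raw_values, gains)]


# Example: two sensors read 40 and 80 for the same reference touch, so the
# first is assigned twice the gain of the second.
normalized = normalize([40, 80], [2.0, 1.0])  # both become 80.0
```

With the outputs normalized this way, a single panel-wide threshold (as in the virtual-button logic above) can behave consistently regardless of where on the panel the touch occurs.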