Computing systems having touch input capabilities are now commonplace, for example in the form of mobile telephones and tablet computers. Such devices may be configured to detect input touches from a user, including touch inputs on the display surface by a user's finger, touch inputs on the display surface by a stylus and near-touch inputs directed toward the display surface.
Computing systems having touch input capabilities include touch pad devices and touch screen devices. For the case of touch pad devices, a cursor on a display screen is controlled by the movement of a user's finger or a stylus relative to the touch pad area. For the case of touch screen devices, the display screen is covered by a touch sensitive panel, thereby enabling a user to directly select Graphical User Interface (GUI) objects on the screen by positioning their finger or a stylus over such objects.
In either case, the touch input capabilities may comprise a so-called tap-to-select input, in which the tap of a user's finger or a stylus on a relevant location on the touch pad or touch screen selects a function displayed on the selected region of the screen. One problem with tap-to-select inputs is that the function to be selected must be displayed on the screen of the device in order to enable the user to tap the relevant location on the screen or touch pad. Accordingly, the number of different functions that a user may select is limited by the space available on the screen. This is particularly problematic for the case of smartphones or other portable devices having limited screen sizes.
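The tap-to-select behaviour described above amounts to a hit test: each GUI object occupies a region of the screen, and a tap selects the function of the object displayed under the tap coordinate. The following is a minimal sketch of such a hit test, with hypothetical names (`GuiObject`, `hit_test`) chosen for illustration only; it is not an implementation from any particular device, but it makes the stated limitation concrete, since a tap outside every displayed object selects nothing.

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class GuiObject:
    """A tappable on-screen object (hypothetical representation)."""
    name: str      # function the object triggers when tapped
    x: int         # left edge of the object's on-screen region
    y: int         # top edge of the object's on-screen region
    width: int
    height: int

    def contains(self, tap_x: int, tap_y: int) -> bool:
        """True if the tap coordinate falls inside this object's region."""
        return (self.x <= tap_x < self.x + self.width
                and self.y <= tap_y < self.y + self.height)

def hit_test(objects: List[GuiObject], tap_x: int, tap_y: int) -> Optional[str]:
    """Return the function of the topmost object under the tap, or None.

    Objects later in the list are treated as drawn on top of earlier ones.
    """
    for obj in reversed(objects):
        if obj.contains(tap_x, tap_y):
            return obj.name
    return None

# On a small screen, only a few objects fit, so only a few functions
# are selectable by tapping:
screen = [
    GuiObject("open", 0, 0, 160, 80),
    GuiObject("save", 160, 0, 160, 80),
]
print(hit_test(screen, 200, 40))   # tap lands on the "save" object
print(hit_test(screen, 200, 400))  # tap lands where no function is displayed
```

The sketch illustrates why the selectable-function count scales with screen area: a function with no displayed region can never be returned by the hit test.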
Based on the above, there is a need for improvements in the way touch inputs are performed on touch sensitive devices.