A number of electronic devices are known in the art which utilize a touch screen for the user interface. For example, laptop computers, tablet computers, handheld gaming devices and mobile telephones (specifically, smart phones) conventionally include a display screen which incorporates a touch screen user interface.
The provision of motion sensing circuitry in such electronic devices is also well known. Exemplary motion sensors include one or more of an accelerometer (for acceleration detection), a gyroscope (for orientation detection), a compass/magnetometer (for direction detection), a location (for example, GPS) sensor (for location detection), a pressure sensor (for elevation detection), and the like. These motion sensors may provide an additional means for enabling the user to interface with the device and, in particular, to control the execution of applications running on the device or services provided by the device.
Published United States Application for Patent No. 2012/0050176 (Mar. 1, 2012) to Chin discloses an electronic computing device with a touch sensitive display screen and an accelerometer. A touch signal (obtained from the touch sensitive display screen) and an acceleration signal (obtained from the accelerometer) are processed by a processor of the electronic computing device to provide combined user interface control signaling which is indicative of not only the location on the screen at which a touch is made, but also an impact of that touch. A higher level application being executed by the processor is responsive to the combined user interface control signaling. A specific example provided by Chin relates to a musical instrument application (specifically, a piano) and the use of the combined user interface control signaling to specify the particular key that has been struck (touch detection) along with an indication of impact used to modify the audible volume of the note tone which corresponds to the struck key (acceleration detection).
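The combination of a touch signal with an accelerometer-derived impact value, as described by Chin, might be sketched as follows. This is a minimal illustrative sketch only; the function name, the normalization scheme, and the assumed full-scale acceleration are assumptions for illustration and are not taken from Chin's disclosure.

```python
def combined_control_signal(touch_x, touch_y, accel_magnitude, max_accel=4.0):
    """Combine a touch position with an accelerometer impact reading.

    Returns the touch location together with a normalized impact value
    in [0, 1]. A higher-level application (e.g. the piano application
    described by Chin) could map the location to a particular key and
    the impact value to the audible volume of the note tone.
    """
    # Clamp the measured impact to a non-negative range and normalize
    # against an assumed full-scale acceleration for the device.
    impact = min(max(accel_magnitude, 0.0), max_accel) / max_accel
    return {"x": touch_x, "y": touch_y, "impact": impact}
```

In this sketch, a harder strike yields a larger accelerometer reading and hence a larger normalized impact, which the application can use directly as a volume factor.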
Chin further notes that the acceleration signal contribution to the combined user interface control signaling is adversely dampened if one or more other fingers are simultaneously resting on the touch sensitive display screen. To address this problem, Chin teaches scaling the acceleration signal contribution as a function of a distance between the position of the currently sensed touch and the position of another finger simultaneously resting on the touch sensitive display screen. The calculated scaling factor is then applied to further modify the audible volume.
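The distance-dependent scaling described above might be sketched as follows. The linear mapping, the factor range, and the parameter names are illustrative assumptions and do not reflect Chin's actual formula; the only property taken from the source is that the compensating scale factor depends on the distance between the sensed touch and a resting finger.

```python
import math

def dampening_scale(touch_pos, resting_pos, max_distance=200.0):
    """Compensating scale factor for dampening by a resting finger.

    The closer a resting finger is to the sensed touch, the more the
    impact signal is dampened, so the larger the compensating factor.
    Positions are (x, y) tuples in screen units; max_distance is an
    assumed span beyond which no compensation is applied.
    """
    d = math.hypot(touch_pos[0] - resting_pos[0],
                   touch_pos[1] - resting_pos[1])
    # Linear compensation: factor 2.0 at zero distance, falling to 1.0
    # (no compensation) at or beyond max_distance.
    closeness = max(0.0, 1.0 - d / max_distance)
    return 1.0 + closeness
```

The resulting factor would then multiply the impact contribution before the audible volume is computed, restoring volume lost to the dampening effect.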
There is a need in the art for an improved means for generating the combined user interface control signaling from a touch screen sensor and one or more other motion sensors for user interface control.