As computer processors have decreased in size and expense, mobile electronic devices, such as mobile phones, tablets, etc., have become increasingly widespread. Designed to be portable, many mobile electronic devices are lightweight and small enough to be worn or carried in a pocket or handbag. However, the portability of modern mobile electronic devices comes at a price: to reduce size and weight, today's mobile electronic devices often incorporate small input devices. For example, many current mobile electronic devices include soft QWERTY keyboards displayed on a touch screen, which many people (especially those with poor dexterity) find difficult to use.
Gesture-based input means provide an alternative to conventional input means (e.g., keyboards, touch screens, etc.) provided on mobile electronic devices. Moreover, gesture-based input means are not constrained by the size of the mobile electronic device and, thus, can provide a larger area for inputting data to the mobile electronic device.
One gesture-based input means that has recently been proposed is the “Skinput” concept, in which a user interface is projected onto a surface of the user's arm and the user performs gestures on the surface of the arm in the form of “taps”. The location of each “tap” is determined based on acoustic data obtained from acoustic sensors. An input command is then issued based on the determined location.
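To illustrate the general principle, the following is a minimal, hypothetical sketch (not the actual Skinput implementation) of how acoustic features measured at a tap might be matched against calibrated locations on the arm, with each location mapped to an input command. The calibration vectors, location names, and command names below are all assumptions for illustration.

```python
import math

# Assumed calibration data: a representative acoustic feature vector
# (e.g., band energies from arm-mounted acoustic sensors) recorded
# while tapping each location. These values are purely illustrative.
CALIBRATION = {
    "forearm_upper": [0.9, 0.2, 0.1],
    "forearm_lower": [0.3, 0.8, 0.2],
    "palm":          [0.1, 0.3, 0.9],
}

# Hypothetical mapping from tap location to an input command.
COMMANDS = {
    "forearm_upper": "volume_up",
    "forearm_lower": "volume_down",
    "palm":          "select",
}

def locate_tap(features):
    """Return the calibrated location whose feature vector is closest
    (by Euclidean distance) to the observed tap's features."""
    return min(
        CALIBRATION,
        key=lambda loc: math.dist(features, CALIBRATION[loc]),
    )

def tap_to_command(features):
    """Determine the tap location, then issue the mapped command."""
    return COMMANDS[locate_tap(features)]
```

For example, an observed feature vector close to the "forearm_upper" calibration vector would be resolved to that location and mapped to its command. A real system would use a trained classifier over richer acoustic features rather than nearest-centroid matching, but the pipeline (sense, localize, dispatch) is the same.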