Human-machine interfaces have long been a subject of interest to designers of human-operated machinery, particularly for machines or devices intended for "unskilled" novice users, such as personal devices of various kinds, remote controls, palm-sized computing devices (also referred to as personal digital assistants (PDAs)), laptop computers, and so forth. Improved ease of use generally improves user satisfaction.
Increasingly, as a result of advances in microprocessor and other related technologies, more and more personal devices are processor-based and multifunctional. For example, today one can acquire a PDA that can also serve as a wireless mobile phone, an MP3 player, and so forth.
Typically, the appropriate end user interfaces, i.e., the interfaces for operating one of these devices as a PDA, a wireless mobile phone, or an MP3 player, and so forth, are presented on a touch-sensitive screen on an as-needed basis. A user interacts with such an interface by touching the appropriate interface element, a visual image, e.g., a key or button image, or a menu or list item image.
Many of these graphical interfaces are intuitive and easy to use. However, as friendly as these graphical interfaces are, there is no tactile feel to the touching of a key or button image (i.e., a user does not feel the click of a real key/button). The same applies to the selection of menu or list items. The lack of tactile feedback is "difficult" or "less satisfying" for some users.
Thus, it is desirable to further enhance the user experience by providing the user with tactile sensations when interacting with at least some of the interface elements.