As handheld communication devices become part of everyday life, device manufacturers and service providers strive to enhance the versatility and performance of such devices.
Handheld communication devices known in the art (e.g., mobile phones, pagers, personal digital assistants (PDAs), and the like) typically use auditory and visual cues to alert a user when incoming messages, such as voice calls and emails, are received. Such auditory and visual alerts, however, have the disadvantage of being distracting in some situations (e.g., while driving) and annoying in others (e.g., during a meeting or a concert). Likewise, they are insufficient in still other situations, such as a noisy environment (e.g., at a pub or in a crowd), or when a call is dropped and the user continues talking without realizing that no one is listening.
Although vibratory alerts are available in some communication devices such as cellular phones, such vibratory effects cannot, in accordance with the known prior art, be customized or personalized according to applications, and thus are capable of conveying little information to the user. A need, therefore, exists in the art for a new sensory modality that delivers information to users of handheld communication devices in a personalized fashion.
Such a sensory modality would help cut through communication clutter by prioritizing, categorizing, or highlighting messages and content as they are received. It would provide mobile phone users with better control over their handsets and communications through greater flexibility, accuracy, and speed.
Moreover, engaging the sense of touch would enhance the reality of the user experience. Touch unifies the spatial senses: those one uses to navigate the world, namely sight, sound, and touch. Touch produces reflex-rate responses in milliseconds and supplies a completeness that sight and sound together cannot replace. In short, touch makes an experience truly personal.