1. Field of Invention
The present invention relates generally to the field of input devices. More specifically, the present invention is related to virtual input devices for electronic or computer based systems.
2. Discussion of Prior Art
With conventional keyboards, each character key occupies a fixed position on the board. To type a character, a user must position a finger at the location of the key and press downward. Each key has a finite size, and a finite separation must be maintained between keys so that a user does not strike two keys simultaneously. This need for finite spacing limits how much a conventional keyboard can be reduced in size and still remain operational.
For many electronic devices which require keyboard input, a conventional keyboard is not practical. Most portable devices are designed to be small enough for the user to carry without undue burden. Pagers, cellular phones and palmtop PCs are typically designed to fit into a shirt pocket, and portable PCs are designed to be carried conveniently in a small case or bag. These devices have limited surface space, and accordingly their keyboards are designed with reduced-size keys, with fewer character keys, or with the keys arranged differently from the traditional QWERTY layout. Some devices are operated using a stylus interface, in which the user points the stylus at one character at a time on a graphic display of a keyboard in order to select the character. For a trained typist, such an interface is exceedingly slow and tedious. Thus it is desirable to have a virtual keyboard which can be typed upon in a traditional manner, yet is small enough to fit on the surface of a small electronic device.
Devices providing keyboard style entry, allowing touch typing without traditional hard-wired keys, can generally be separated into two groups. The first group focuses on sensing finger motion or gestures by analyzing the image from a monitoring TV camera or an electronic glove worn by the user. The finger motions can be the traditional movements a touch typist would normally make to type a character, or specialized gestures such as "pushing," "pulling," or making a circle with the hand may be used for specific inputs. The use of finger motions and gestures in combination provides for a wider range of inputs. The second group uses various measuring techniques to determine the position of the finger on the keypad. Using detectors, the X-Y position of each finger is determined and, based on this X-Y position, a character is input.
When a person taps an object with a finger, a sound is produced. For example, if a wooden board is tapped with a person's index finger, we hear the sound produced. The sound is caused by vibrations induced in the object being transmitted through the air to the human ear, which converts these vibrations into what we perceive as sound. Depending upon which finger is used to tap the object and the properties of the object tapped, different sounds are produced. The different sounds are the result of the vibrations having different characteristics, such as different frequency components (spectral characteristics). The sounds are unique to the finger used to tap the object and to the object tapped. These unique sounds are termed "acoustical signatures." The present invention recognizes and employs these signatures in the embodiments described.
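The spectral distinction described above can be sketched in code. The following is a minimal illustration, not the patent's method: it assumes a sampling rate, a fixed number of frequency bands, and nearest-neighbour matching, all of which are choices made here for demonstration only. The idea is that a tap's power spectrum, reduced to normalized per-band energies, serves as a coarse "acoustical signature" that can be compared against stored signatures for each finger.

```python
import numpy as np

SAMPLE_RATE = 8000  # Hz; assumed sampling rate for the recorded tap


def spectral_signature(tap, n_bands=8):
    """Reduce a recorded tap (1-D sample array) to a coarse signature:
    the fraction of total energy in each of n_bands frequency bands."""
    power = np.abs(np.fft.rfft(tap)) ** 2        # power spectrum of the tap
    bands = np.array_split(power, n_bands)       # coarse frequency bands
    energy = np.array([b.sum() for b in bands])
    return energy / energy.sum()                 # normalize: amplitude-invariant


def classify_tap(tap, known_signatures):
    """Match a tap against stored signatures (e.g. one per finger) by
    nearest neighbour in signature space; returns the closest label."""
    sig = spectral_signature(tap)
    return min(known_signatures,
               key=lambda name: np.linalg.norm(sig - known_signatures[name]))
```

Because the signature is normalized, a softer tap by the same finger (same spectrum, lower amplitude) yields the same signature, which is one reason spectral shape rather than raw loudness is the useful feature here.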
Some references exemplifying devices providing keyboard style entry without traditional hard-wired keys are discussed below, however, each reference fails to utilize acoustical signatures to determine input parameters.
U.S. Pat. No. 5,767,842 describes an input device that records the image of the fingers using a TV camera and analyzes the motion of the hand.
Japanese patent publication No. 09054646 describes the use of an electronic glove to determine finger motion.
Japanese patent publication No. 06214712 describes an input device which detects finger motion using electromagnetic waves and ultrasonic waves to determine the key input.
The IBM TDB v.32 No. 10B describes a device that measures the position and motion of the fingers using a device such as a glove worn by the user to determine the characters typed.
Japanese patent publication No. 09330175 describes an input device which uses pressure sensors; by examining the pressure distribution, the finger positions are located. The use of acoustical signatures is not disclosed in this reference.
The U.S. Pat. No. 5,378,069 describes an input device using emitters and detectors to determine the positions of the fingers. The positions of the fingers are used to determine which key is pressed. The use of acoustical signatures of the fingers is not described.
The IBM TDB v20 No.7 describes a touch pad which uses two sheets of flexible material, acoustical wave generators, and acoustical wave receivers. A finger touch to the pad couples the acoustical wave to the top sheet and by measuring the amount of time it takes the wave to reach the acoustical receivers, the position of the finger can be detected. However, this reference does not utilize the unique acoustical signature generated by the fingers.
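The time-of-flight position measurement used by such touch pads can be illustrated with a simple one-dimensional sketch. This is not the referenced implementation: the wave speed, strip length, and receiver placement below are illustrative assumptions. With receivers at the two ends of a strip, the difference in wave arrival times determines the touch position.

```python
WAVE_SPEED = 3.0e3  # m/s; assumed acoustic wave speed in the sheet


def locate_touch_1d(t_left, t_right, length):
    """Infer the touch position x (metres) on a strip of the given length
    from wave arrival times at receivers placed at x = 0 and x = length.

    t_left = x / v and t_right = (length - x) / v, so
    t_left - t_right = (2x - length) / v, solved for x below."""
    dt = t_left - t_right                       # arrival-time difference (s)
    x = (length + WAVE_SPEED * dt) / 2.0
    return min(max(x, 0.0), length)             # clamp to the strip
```

Note that only the arrival-time *difference* matters, so the receivers need a common clock but the exact moment of the touch need not be known.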
The IBM TDB v36 No. 11 describes an adaptive algorithm to adjust key positions to fit an individual's preferences. This reference, however, does not describe the physical implementation of a virtual keyboard.
The IBM TDB v20 No. 4 describes a keyboard which has slight indentations in the keyboard in the positions of the keys. By positioning a finger on an indentation, a light sensor detects the position of the finger on the board and determines the key pressed.
The U.S. Pat. Nos. 5,764,794 and 5,058,046 describe stylus type input devices. Pens are used to interface with keyboard or other images on a display device to choose the data to be input.
None of these references teaches the concept of recognizing a particular finger by its unique acoustical signature. The use of acoustical signatures allows a reduced set of data to be analyzed compared with image-analysis techniques, and eliminates the need for gloves or the like.
Whatever the precise merits, features and advantages of the above cited references, none of them achieves or fulfills the purposes of the present invention. These and other objects are achieved by the detailed description that follows.
A virtual input device uses an acoustical signature of a user's finger to determine which character is selected. Acoustical sensors having different acoustical properties are placed in spaced relationship to each other. When a user touches a sensor with a finger, a unique acoustical signature is produced. The acoustical signature is analyzed to determine the finger used, the sensor touched, and the specific action by the finger, e.g. slide, press, or tap. The combination of sensor, finger and action defines the character selected. The characters are associated with each combination of sensor, finger and action in such a way that a traditional keyboard layout is maintained, so that a proficient touch typist does not need to re-learn key positions. Visual feedback of the character selected is provided by a display device of the system.
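The association of (sensor, finger, action) combinations with characters described above can be sketched as a lookup table. All sensor, finger, and action names below are hypothetical placeholders chosen for illustration; the patent does not specify this encoding.

```python
# Hypothetical mapping from (sensor, finger, action) triples to characters.
# A real device would populate this so that it mirrors a QWERTY layout.
KEYMAP = {
    ("home_row", "left_index", "tap"):     "f",
    ("home_row", "right_index", "tap"):    "j",
    ("home_row", "left_index", "slide"):   "g",   # slide selects the adjacent key
    ("space_bar", "right_thumb", "press"): " ",
}


def select_character(sensor, finger, action):
    """Resolve a classified (sensor, finger, action) triple to the
    character it encodes, or None if the combination is undefined."""
    return KEYMAP.get((sensor, finger, action))
```

Because each sensor supports several fingers and several actions, a small number of physical sensors can encode a full character set, which is what lets the device shrink below the size of a conventional keyboard.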