Many handicapped persons are extremely limited in their ability to communicate due to their inability to control limb movement. Attempts (some successful) have been made to provide ways requiring minimal muscular input from the operator to control the performance of simple routine tasks, thereby giving disabled persons a further degree of independence. Generally, existing systems are relatively slow and difficult or tedious to operate, particularly over sustained periods.
In a limited number of cases a computer has been successfully used by handicapped people; however, such use is limited by the lack of an appropriate interface between the computer and the handicapped person. Normally, input to the computer requires some degree of manual dexterity and involves, for example, keyboard or computer mouse input. When the hands are restricted or not available, these input devices are not satisfactory.
Some solutions assume that the user has normal function of the neck and head, but there are many disabled persons who cannot control those muscles. Some are able to control only their mouth and tongue, and some are also afflicted with spastic motion, making the interface between such persons and a computer even more difficult.
The most common solution to the interfacing problem for persons having control of head and neck is the tapping stick. The user simply grips one end of the stick in his mouth and taps the opposite end on the keyboard. Users can become quite proficient, and the cost of this solution is relatively low since no modifications to the computer are required. However, the system is limited, as such a user faces great if not insurmountable difficulty in operating control functions that require the manipulation of more than one key simultaneously.
Those individuals who do not have these faculties and can only control their mouth have few avenues available. Speech recognition systems have been used, but they are extremely expensive and very limited in effectiveness. The most popular interface uses a puff-suck tube which is held in the mouth like a straw; a sequence of pressure fluctuations caused by blowing and sucking is detected and used, for example, as a switch (binary system) or to generate a sequence representing some form of code such as Morse code. These code signals are then translated into computer inputs.
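The translation from puff-suck events into Morse code and then into characters can be sketched as follows. This is a minimal illustration, not any particular commercial system; the mapping of a puff to a dot and a suck to a dash is an assumption made here for clarity, and real devices may instead distinguish dots from dashes by pulse duration.

```python
# Sketch: decoding puff/suck events into text via Morse code.
# Assumption (for illustration only): 'p' (puff) -> dot, 's' (suck) -> dash.

MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode(events):
    """Translate puff/suck event groups into text.

    `events` is a list of strings, one per character; each string
    holds 'p' (puff -> dot) and 's' (suck -> dash) events.
    An unrecognized sequence yields '?'.
    """
    symbols = {"p": ".", "s": "-"}
    out = []
    for group in events:
        code = "".join(symbols[e] for e in group)
        out.append(MORSE.get(code, "?"))
    return "".join(out)

print(decode(["pppp", "p", "pspp", "pspp", "sss"]))  # prints "HELLO"
```

Such a scheme illustrates why these interfaces are slow: every character requires several deliberate pressure events, which is the tedium over sustained periods noted above.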
None of these systems is suitable for cursor control in the way a computer mouse, for example, can be used to interface with a computer. If the handicapped person had the ability to move the cursor over the screen of the monitor and then, at the appropriate location, to make a selection by activating, for example, an on-off switch as with the computer mouse, a whole new realm of communication would be possible. This would enable a severely handicapped person to operate much more complicated software programs such as graphics or drawing packages or the like, and could also be used to control various appliances throughout the house, such as the telephone, lights, stereo, television, etc., by activating same through window displays on the computer.
It has been suggested to use the tongue as the operating element for controlling a potentiometer, as described in U.S. Pat. No. 4,728,812 issued to Sheriff et al. on Mar. 1, 1988. This device controls the operation of a machine by jaw and tongue movement of the operator. The control is effected by movement of the jaws, which moves a pair of levers to adjust the contacts of a potentiometer. A tongue-actuated microswitch is provided for added control. Such a device is useful for some purposes.
U.S. Pat. No. 4,484,026 issued to Thornburg on Nov. 20, 1984 and U.S. Pat. No. 4,529,959 issued to Kazuhiko et al. disclose various digitizer control pads. Interlink Electronics, 1110 Monk Avenue, Carpinteria, Calif. 93105, offers a line of force-sensing resistors and XY and XYZ digitizer pads that may be applied to various applications.