The disclosure relates to a system and method for using an invisible interface for receiving a non-contact input signal, such as a non-articulated change in shape, for controlling a device. Although the present system is discussed in the context of a gaming application, the embodiments discussed herein are amenable to other scenarios that operate using a human-machine interface.
The Human-Computer Interface (HCI) is a communication paradigm between the human user and the computer. The user operates and controls the computer or device through various inputs, made at the HCI, which are provided as control signals transmitted to a computer processor for generating a particular action. Conventional HCIs accept input via mechanical contact devices such as computer keyboards, mice, and touch screens. Assistive technology includes assistive (as well as adaptive and rehabilitative) devices that enable people with disabilities to perform tasks using an alternative input device (an alternative HCI), such as electronic pointing devices, joysticks, and trackballs. For example, Sip-and-Puff technology is a type of assistive technology that enables a user to control peripheral devices using mouth-controlled input, such as air pressure, particularly by inhaling or exhaling on a straw, tube, or wand. Also known is a non-contact (pointing) input device that responds to the volume, associated with a pressure, of the user's controlled breathing directed into a microphone. Similarly, a breath-signal controller uses a sensor to measure the pressure produced as a user inhales and exhales. Regardless of the attribute being sensed and/or measured, however, the technology has not advanced greatly toward applying breath pressure, or a similar attribute, as an input signal to a controller for controlling a device.
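The sip-and-puff scheme described above can be illustrated with a minimal sketch that maps a signed pressure reading into discrete control signals. The function name, threshold values, and units below are illustrative assumptions, not part of the disclosure.

```python
def classify_breath(pressure_kpa: float,
                    puff_threshold: float = 0.5,
                    sip_threshold: float = -0.5) -> str:
    """Map a pressure reading (relative to ambient, in kPa) to a command.

    Positive pressure (exhaling into the tube) -> "puff"
    Negative pressure (inhaling)               -> "sip"
    Readings near ambient                      -> "neutral"
    """
    if pressure_kpa >= puff_threshold:
        return "puff"
    if pressure_kpa <= sip_threshold:
        return "sip"
    return "neutral"

print(classify_breath(0.8))   # strong exhale -> puff
print(classify_breath(-0.7))  # inhale -> sip
print(classify_breath(0.1))   # near ambient -> neutral
```

Each returned label would then be translated by the device controller into a particular action, in the manner of the control signals discussed above.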
Recent developments in the gaming industry enable detected movements to be applied as an alternative form of input. Motion-input devices determine parameters of relative motion (via accelerometers), absolute motion (via body or controller localization), and body posture (via image analysis and depth maps), which can be used to provide input signals to a gaming console.
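As one hypothetical sketch of the relative-motion parameter mentioned above, accelerometer samples can be numerically integrated into velocity estimates of the kind a motion-input device might supply to a gaming console. The function and sampling values here are assumptions for illustration only.

```python
def integrate_acceleration(samples, dt):
    """Estimate relative velocity (m/s) from acceleration samples
    (m/s^2) taken at a fixed interval dt (seconds), starting from rest,
    using the trapezoidal rule."""
    velocity = 0.0
    velocities = []
    for a_prev, a_curr in zip(samples, samples[1:]):
        velocity += 0.5 * (a_prev + a_curr) * dt  # trapezoidal step
        velocities.append(velocity)
    return velocities

# Constant 1 m/s^2 acceleration sampled at 100 Hz:
print(integrate_acceleration([1.0, 1.0, 1.0], 0.01))
```

In practice such devices also filter sensor noise and drift before using the result as an input signal, but the integration step above captures the basic idea.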
New approaches to sensing attributes and applying them as device inputs can provide useful options to a number of industries, including the healthcare and gaming industries. A Natural User Interface ("NUI") is an interface that is effectively invisible and relies on the user, rather than an artificial interface or control device, interacting with the technology. In other words, the user (i.e., the human body) is the interface, and the input signals applied to a processor controlling the device are associated with observed (intentional or unintentional) actions of the user. NUIs are characterized by a shallow learning curve: the interface still requires some learning, but the user generally experiences a quick transition from novice to expert.
However, neither the motion-input devices nor the known human-machine interface devices apply non-articulated changes in shape as an input attribute for generating control signals to a controller of a device. A NUI (i.e., an invisible interface) that exploits gestures and other changes in body motion for controlling a device is therefore desired.