The present invention relates to a system and method that generates outputs based on the position of a sensor. The position of the sensor can be related to many variables, but is preferably determined by the movement or gestures of a person.
According to one embodiment of the present invention, the outputs are preferably in the form of sounds which can be played based on the gestures of a person to produce music.
An example of an electronic musical instrument that plays sound based on the movement and gestures of a person is the “Virtual Air Guitar” system that is currently featured at the Heureka Science Centre in Finland. The Virtual Air Guitar system utilizes visual recognition technology whereby the hand movements and gestures of a user wearing particular gloves are observed by a webcam and analysed by gesture recognition software. The software monitors the relative spatial position between the user's hands and assumes that the right hand is located on the body of a virtual guitar and that the left hand is located on the fret board of the virtual guitar. Movement of the right hand in an upward and downward direction is interpreted as simulating strumming of the guitar, while movement of the left hand toward and away from the right hand is interpreted as simulating movement of the left hand along the neck of a guitar.
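The interpretation described above can be sketched in code. The following is a minimal, hypothetical illustration of such a mapping, not the actual Virtual Air Guitar implementation: all function names, parameters, and thresholds are illustrative assumptions. It maps the distance between the hands to a position along the fret board and treats a sufficiently large vertical displacement of the right hand between frames as a strum.

```python
# Hypothetical sketch of hand-position interpretation of the kind described
# above. Names, units, and thresholds are illustrative assumptions only.
import math


def interpret_hands(left, right, prev_right_y,
                    neck_length=0.6, strum_threshold=0.02):
    """Map 2-D hand positions (in the webcam's frame) to guitar parameters.

    left, right: (x, y) coordinates of the left and right hand.
    prev_right_y: right-hand y coordinate from the previous video frame.
    Returns (fret_position, strummed), where fret_position lies in [0, 1]
    (0 = near the body, 1 = far end of the neck) and strummed is True when
    the right hand has moved vertically far enough to count as a strum.
    """
    # Distance between the hands approximates position along the fret board.
    hand_span = math.dist(left, right)
    fret_position = max(0.0, min(1.0, hand_span / neck_length))

    # A large enough up/down displacement of the right hand is a strum.
    strummed = abs(right[1] - prev_right_y) > strum_threshold
    return fret_position, strummed
```

In such a scheme the fret position would select pitch and the strum event would trigger the note, with both quantities updated once per video frame.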
The system enables the user to make gestures or actions as if they were playing an actual guitar, producing sound within the mathematical constraints of the model. However, the optical approach has inherent complications arising from its reliance on a single camera source: for example, it would be very difficult to play the Virtual Air Guitar behind one's head, or whenever another object blocks the operator's hands from the field of view of the webcam.
Furthermore, in our view it would be difficult to usefully employ the sound produced by the system in a musical performance that is able to be reliably reproduced or played to accompany other musical instruments.