This invention relates to methods and apparatus for identifying a predetermined movement of an object; in particular, in some embodiments, for identifying a gesture by a human hand for the purposes of providing a gesture-based user interface to an electronic device.
It is known to track the movement of an object, such as a user's finger or hand, by transmitting a succession of signals (e.g. ultrasound pulses) from one or more transmitters, receiving reflected signals at one or more receivers, and tracking movement of one or more objects by analysing changes in the received signals over time. It has been proposed to apply such technology to user interfaces for electronic devices, enabling, for example, a fingertip or hand to be moved in order to control an on-screen cursor. Arrangements similar to this are described in U.S. Pat. No. 5,059,959 (Barry) and U.S. Pat. No. 6,313,825 (Gilbert).
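The pulse-echo principle underlying such tracking can be illustrated with a minimal sketch. The example below estimates the range of a reflecting object from the round-trip delay of a single ultrasound pulse; it is not taken from the cited patents, and the function names, threshold detector, and sample values are illustrative assumptions only.

```python
# Illustrative sketch of pulse-echo (time-of-flight) range estimation,
# assuming the speed of sound in air is approximately 343 m/s.
# All names and parameters here are hypothetical, not from the patents cited above.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C


def find_echo_delay(received, threshold, sample_rate):
    """Return the arrival time (seconds) of the first sample whose
    magnitude exceeds the threshold - a crude echo detector."""
    for i, sample in enumerate(received):
        if abs(sample) >= threshold:
            return i / sample_rate
    return None  # no echo detected


def estimate_range(echo_delay):
    """The pulse travels out and back, so the one-way range is half
    the round-trip distance."""
    return SPEED_OF_SOUND * echo_delay / 2.0


# Example: at a 192 kHz sampling rate, an echo arriving at sample 384
# corresponds to a 2 ms round trip, i.e. an object about 34 cm away.
sample_rate = 192_000  # Hz
received = [0.0] * 384 + [0.5] + [0.0] * 100
delay = find_echo_delay(received, threshold=0.1, sample_rate=sample_rate)
object_range = estimate_range(delay)
```

Repeating such an estimate for each transmitted pulse, and analysing how the echoes change from pulse to pulse, is the basis of the tracking approach described above; full 2D or 3D tracking additionally requires multiple transducers and correspondingly more computation.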
Such an approach, however, requires a high level of computational processing in order to track the object. This need for dedicated processing and electrical power is undesirable, particularly so in the context of resource-constrained mobile devices. The Applicant has realised that such a resource-intensive tracking approach is not necessary in some circumstances, particularly when the movements of interest take place close to a screen or other defined surface. Furthermore, conventional tracking methods may have difficulty in discerning the direction of an object using baseline time-of-flight or array-based methods, owing to the speed at which the object moves and its continually changing 'shape' relative to the transducer setup. Conventional tracking methods also rely on there being a clear point reflector, or a clear leading edge of the object, to track.