CPC G06F 3/017 (2013.01) [G06F 1/163 (2013.01); G06F 3/011 (2013.01); G06F 3/013 (2013.01); G06F 3/014 (2013.01); G06F 3/015 (2013.01); H04W 4/021 (2013.01); H04W 4/026 (2013.01); H04W 4/027 (2013.01); H04W 4/029 (2018.02)] | 20 Claims |
1. A system for gesture-based control, the system comprising:
a wearable device configured to be worn on a body part of a person and comprising:
a plurality of biopotential channels comprising at least a first biopotential channel and a second biopotential channel, each biopotential channel of the plurality of biopotential channels comprising a pair of electrodes and being configured to output signals indicating biopotentials at a respective location on the body part of the person, the output of at least the first biopotential channel being configured to vary in response to motions or intended motions of one or more fingers of the person;
a location sensor, the location sensor being configured to output data indicating a location of the body part; and
a processor;
wherein the system is configured to:
generate a data stream based on the outputs from the plurality of biopotential channels and/or the location sensor;
enter a first state in which the output from the location sensor is processed according to a first set of logical rules;
classify a gesture based on an analysis of an analytical segment of the data stream, the gesture classification comprising:
(a) determining that a first portion of the analytical segment shows a baseline measurement for a parameter related to one or more of the plurality of biopotential channels;
(b) determining that a second portion of the analytical segment following the first portion indicates a change in the parameter, relative to the first portion, for the one or more of the plurality of biopotential channels;
(c) determining that a third portion of the analytical segment following the second portion indicates that the parameter remains in a changed condition, relative to the first portion, for the one or more of the plurality of biopotential channels; and
(d) determining that a time between the second portion and the third portion is greater than a threshold period of time; and
based on the gesture classification comprising steps (a)-(d), transition to a second state in which the output from the location sensor is processed according to a second set of logical rules that is different than the first set of logical rules.
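The classification steps (a)-(d) and the state transition above can be sketched in code. This is a minimal illustrative sketch, not the patented implementation: all names (`classify_hold_gesture`, `update_state`, `State`), the sampling rate, and the numeric thresholds are assumptions, and step (d) is interpreted here as the span from the start of the second portion to the end of the third portion exceeding the threshold.

```python
# Illustrative sketch of claim 1's gesture classification and state
# transition. All identifiers, thresholds, and the sampling rate are
# hypothetical assumptions, not taken from the claim.
from enum import Enum, auto

class State(Enum):
    FIRST = auto()   # location output processed under the first set of logical rules
    SECOND = auto()  # location output processed under the second set of logical rules

SAMPLE_RATE_HZ = 500       # assumed biopotential sampling rate
HOLD_THRESHOLD_S = 0.5     # step (d): assumed threshold period of time
BASELINE_TOLERANCE = 0.1   # step (a): assumed allowed deviation from baseline
CHANGE_THRESHOLD = 0.5     # steps (b)/(c): assumed minimum change vs. baseline

def classify_hold_gesture(segment, first_end, second_end, third_end):
    """Return True if the analytical segment satisfies steps (a)-(d).

    `segment` is a list of values for a parameter related to one or more
    biopotential channels (e.g. a per-sample amplitude), sampled at
    SAMPLE_RATE_HZ; the three indices delimit the first, second, and
    third portions of the segment.
    """
    first = segment[:first_end]
    second = segment[first_end:second_end]
    third = segment[second_end:third_end]

    # (a) first portion shows a baseline measurement for the parameter
    baseline = sum(first) / len(first)
    if any(abs(v - baseline) > BASELINE_TOLERANCE for v in first):
        return False

    # (b) second portion indicates a change relative to the first portion
    if not any(abs(v - baseline) > CHANGE_THRESHOLD for v in second):
        return False

    # (c) third portion indicates the parameter remains in the changed condition
    if not all(abs(v - baseline) > CHANGE_THRESHOLD for v in third):
        return False

    # (d) time between the second and third portions exceeds the threshold
    # (interpreted here as start of second portion through end of third)
    elapsed_s = (third_end - first_end) / SAMPLE_RATE_HZ
    return elapsed_s > HOLD_THRESHOLD_S

def update_state(state, segment, first_end, second_end, third_end):
    """Transition from the first state to the second on a classified gesture."""
    if state is State.FIRST and classify_hold_gesture(
            segment, first_end, second_end, third_end):
        return State.SECOND
    return state
```

In this sketch, a sustained change (e.g. a held pinch) that lasts longer than the threshold moves the system from the first state to the second, after which location-sensor data would be processed under the second set of logical rules.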