Smart mobile devices have developed in two opposite directions: miniaturization and larger interfaces. Miniaturized devices such as smart wristbands and smart watches may have a screen interface that is too small, or no interface at all, which makes the device inconvenient for the user to operate. Large devices, such as smart phones or portable Android devices (PADs) with screens of 5 inches or more, may have an interface too large for a user's hand, so that the device cannot be handled with one hand, which likewise inconveniences the user.
Furthermore, while driving or eating, users may be unable to free their hands to operate a smart mobile device, or greasy hands may fail to operate it. A person with a finger disability may also be unable to operate a smart mobile device. In other cases, operating the device with two hands, or even one hand, is simply inconvenient, all of which brings inconvenience to the user.
Nowadays, the operation of smart mobile devices, such as gesture recognition and unlocking, mainly includes two schemes: a mobile phone unlocking scheme and a mobile phone control scheme.
The mobile phone unlocking scheme is generally used to activate the mobile phone interface. The unlocking of a mobile terminal or a smart wearable device is generally realized through a password, sliding to unlock, long-pressing to unlock, fingerprint unlocking, pattern drawing to unlock, and the like. These unlocking methods require a finger to directly touch the screen and provide no way to unlock through contact-free (i.e., in-air) operation. Further, the operating object is mainly limited to a finger or a stylus; the device cannot be unlocked in a contact-free way by other parts of the human body.
The mobile phone control scheme is generally used to control the functions of the mobile phone after its interface is activated. The mobile phone is manipulated through direct touch on a resistive or capacitive screen, or through physical buttons provided for the corresponding operations. A mobile terminal is generally manipulated by direct finger operation, voice commands, and the like. In these schemes, the phone is operated either through physical buttons or by touching UI buttons on the screen with a finger, so contact-free manipulation cannot be performed.
In another method, structured light is projected onto the front surface of the human body by a laser emitter, and an infrared sensor receives the structured-light pattern reflected by the body. A processing chip then calculates the spatial information of the human body from the position and the degree of deformation of the received pattern on the camera, and functions such as gesture recognition are provided based on the captured patterns. This method requires the additional use of infrared sensors or cameras, requires a specific lattice model, and is also limited to finger manipulation. Still, the functions cannot be operated in a contact-free way by other parts of the human body.
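The depth calculation underlying such structured-light systems can be illustrated by simple triangulation: a projected dot's lateral shift (disparity) on the sensor, together with the known projector-camera baseline and focal length, yields the distance to the surface. The following sketch is illustrative only; the baseline, focal length, and disparity values are hypothetical and do not describe any particular device.

```python
def depth_from_disparity(disparity_px: float,
                         baseline_mm: float,
                         focal_length_px: float) -> float:
    """Triangulated depth (mm) for one structured-light point.

    A point on a nearer surface shifts (deforms) the projected
    pattern more on the sensor, giving a larger disparity and
    hence a smaller computed depth.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_length_px / disparity_px


def depth_map(disparities, baseline_mm=75.0, focal_length_px=580.0):
    """Convert per-point pattern disparities (px) to depths (mm)."""
    return [depth_from_disparity(d, baseline_mm, focal_length_px)
            for d in disparities]


if __name__ == "__main__":
    # Three illustrative points: the largest disparity is nearest.
    print(depth_map([87.0, 58.0, 43.5]))  # → [500.0, 750.0, 1000.0]
```

Recovering a full body or hand model requires many such points plus a known projected lattice, which is why these systems need dedicated emitters and infrared cameras rather than the phone's ordinary touch hardware.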