1. Field of the Invention
The invention relates to a recognizing method, a controlling method, and a device thereof, and more particularly, to a gesture recognizing and controlling method and a device thereof.
2. Description of Related Art
Mice, keyboards and joysticks are conventional interfaces for human-computer interaction. New technologies such as touch control and voice control have since been developed to further improve the convenience of human-computer interaction. Somatosensory control is a newer input method aiming to make human-computer interaction more user-friendly. Gesture recognizing is one form of somatosensory control, since gestures are an intuitive and convenient way for people to communicate with one another in daily life. Lately, consumer attention has gradually focused on gesture recognizing, which has been applied to various fields such as human-computer interaction design, medical rehabilitation, virtual environments, digital art design and game design.
Information for gesture recognizing is mainly classified into two types: dynamic gesture information and static gesture information. Dynamic gesture information includes a hand movement trajectory, position information and a timing relation, while static gesture information mainly relates to variations in hand shape. By analyzing said gesture information, human-computer interaction functionality may be achieved according to different gestures. One method for gesture recognizing utilizes a depth camera to obtain images with depth information. Pre-processing steps such as image binarization, image background removal and noise elimination are required for each image, so that information related to the hand position and gesture of the user may be captured from a series of images. Afterwards, the image coordinates of the hand position may be used to control a cursor on the display. Since such pre-processing is time-consuming, it is difficult to match the speed and accuracy of moving the cursor with a mouse. Therefore, it is critical to improve the interface for gesture controlling so as to achieve human-computer interaction in real time.
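The pre-processing pipeline described above can be sketched in simplified form. The function names, the depth-threshold binarization, and the 3x3 majority filter used for noise elimination below are illustrative assumptions, not details taken from the source; a real implementation would typically use optimized image-processing routines.

```python
# Illustrative sketch of the described pipeline: binarize a depth image,
# suppress noise, then derive hand coordinates for cursor control.
# All names and thresholds here are assumptions for demonstration only.

def binarize(depth_image, near, far):
    """Keep pixels whose depth falls within [near, far] (assumed hand range)."""
    return [[1 if near <= d <= far else 0 for d in row] for row in depth_image]

def remove_noise(mask):
    """Simple 3x3 majority filter standing in for noise elimination."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neighbors = [
                mask[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
            ]
            # A pixel survives only if most of its neighborhood is foreground.
            out[y][x] = 1 if sum(neighbors) > len(neighbors) // 2 else 0
    return out

def hand_centroid(mask):
    """Image coordinates of the hand region, usable to position a cursor."""
    points = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not points:
        return None
    return (sum(x for x, _ in points) / len(points),
            sum(y for _, y in points) / len(points))
```

Even in this toy form, the per-pixel neighborhood scan in `remove_noise` illustrates why such pre-processing is costly: every frame requires work proportional to the image size times the filter size, which is the latency bottleneck the passage identifies.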