1. Field of the Invention
The present invention relates to an augmented reality-based hand interaction apparatus and method using image information, and more particularly, to an augmented reality-based hand interaction apparatus and method using image information which enable an image-based augmented reality system to provide intuitive interaction between an augmented three-dimensional (3D) virtual object and a user.
2. Discussion of Related Art
User interfaces (UIs) designed for two-dimensional (2D) planes are employed as they are in existing 3D televisions (TVs), augmented reality, and virtual reality, and may be operated in a virtual touch manner or by moving a cursor.
In augmented reality or virtual reality, menus take the form of icons and are managed in a folder or on a separate screen, that is, at an upper level. Sub-items of a given menu can be viewed in a drag-and-drop manner or by means of selection.
Meanwhile, an augmented reality system according to related art provides a sense of immersion by synchronizing virtual content with real space. However, when a physical interaction device is used, interaction with the user does not occur in the augmented 3D space; instead, input and output are performed through a display, so the sense of immersion is lost.
As other related art, there is an interaction technique for recognizing a hand based on a red, green, and blue (RGB) image. According to this technique, a hand region is found in an RGB image using a skin color model, so misrecognition occurs when a color similar to skin color is present in an object which is a target of augmented reality.
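For illustration, a skin-color-model segmentation of the kind described above can be sketched as a chroma threshold in the YCrCb color space. The threshold ranges below are illustrative assumptions, not values taken from the related art; any pixel whose chroma falls inside the range is classified as skin, which is exactly why a skin-colored background object causes misrecognition.

```python
import numpy as np

# Illustrative YCrCb skin chroma ranges (assumed values; real systems
# tune these empirically or learn them from training data).
CR_RANGE = (133, 173)
CB_RANGE = (77, 127)

def rgb_to_ycrcb(rgb):
    """Convert an RGB image (H x W x 3, uint8) to YCrCb (ITU-R BT.601)."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0
    cb = (b - y) * 0.564 + 128.0
    return np.stack([y, cr, cb], axis=-1)

def skin_mask(rgb):
    """Return a boolean mask of pixels whose chroma lies in the skin range."""
    ycrcb = rgb_to_ycrcb(rgb)
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return ((CR_RANGE[0] <= cr) & (cr <= CR_RANGE[1]) &
            (CB_RANGE[0] <= cb) & (cb <= CB_RANGE[1]))
```

Note that the test is purely chromatic: a wooden desk or a beige marker target can fall inside the same chroma ranges as skin, producing the misrecognition noted above.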
Also, after a hand is specified using an RGB image alone, the coordinates of a fingertip, the center of the hand, etc., are found in only two dimensions. Therefore, it is neither possible to determine an accurate 3D position nor to produce occlusion effects between pieces of augmented 3D content.
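The limitation above can be made concrete with a minimal sketch, under the assumption that the palm center is approximated by the mask centroid and the fingertip by the topmost skin pixel (both common heuristics, not the specific method of the related art). Everything the sketch can return is a (row, column) pair in image coordinates; no depth value exists, so the 3D position needed for occlusion between augmented objects cannot be recovered.

```python
import numpy as np

def hand_features_2d(mask):
    """From a binary hand mask (H x W, bool), estimate the palm center
    (centroid of the mask) and a fingertip (topmost skin pixel).
    Both results are 2D image coordinates only: without depth
    information there is no z component, so the true 3D position
    of the hand remains unknown."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None, None  # no hand pixels found
    center = (float(rows.mean()), float(cols.mean()))   # palm-center proxy
    top = rows.argmin()
    fingertip = (int(rows[top]), int(cols[top]))        # topmost pixel
    return center, fingertip
```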