1. Field of the Invention
The present invention generally relates to an apparatus and method for sensing a three-dimensional (3D) object, and more particularly, to an apparatus and method for accurately sensing a 3D object displayed as a stereoscopic User Interface (UI) in a stereoscopic space.
2. Description of the Related Art
With the development of three-dimensional (3D) displays for portable terminals, a user interface can be stereoscopically displayed as a 3D object in a virtual stereoscopic space. Techniques for providing interaction with a 3D object in a virtual stereoscopic space include a first scheme using a touch sensor and a second scheme using proximity touch or input in a space.
The first scheme, using a touch sensor, stereoscopically displays a User Interface (UI) but relies on a touch-based two-dimensional (2D) coordinate input scheme. The second scheme, using proximity touch/input in a space, can obtain 3D space coordinates. In the second scheme, input coordinates in the space are matched with the space coordinates of a 3D object to enable interaction with the 3D object in the 3D space.
When the touch sensor is used, an input object such as a hand must pass through a 3D object displayed in front of the front surface of the display unit in order to interact with it, causing visual discomfort. To prevent this phenomenon, the 3D object may instead be displayed behind the rear surface of the display unit; in this case, however, the stereoscopic effect is degraded and the configuration of the 3D object may be subject to restrictions.
FIG. 1 is a diagram for describing a state in which a 3D object displayed in a virtual stereoscopic space is displayed in an overlapping manner with an object such as a hand according to the related art.
Referring to FIG. 1, a 3D object displayed in a virtual stereoscopic space overlaps an object such as a hand. A 3D object displayed as a stereoscopic UI occupies a virtual space A on the front surface of the display unit and a virtual space B on the rear surface of the display unit. When a user touches the display unit to select a corresponding 3D object, the user's hand passes through the 3D object, such that the 3D object and the user's hand are displayed in an overlapping manner.
When proximity touch/input in a space is used, 3D space coordinates are detected for interaction with the 3D object, so the foregoing overlap problem does not occur. However, due to limitations of the detection method and technique, sensing resolution decreases as the distance between the display unit and the user increases. For example, at a height of 1 cm from the front surface of the display unit, 5×3 icons may be distinguished and selected with an accuracy of 99%; at a height of 3 cm, the accuracy decreases to 80%, so that to maintain the same 99% accuracy, only 3×2 icons can be displayed.
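The trade-off described above, in which coarser sensing resolution at greater heights forces a sparser icon grid, can be sketched with a simple model. The model, screen dimensions, and error-growth figures below are illustrative assumptions for clarity, not values taken from the disclosure: the positional error radius is assumed to grow linearly with the height above the display, and each icon cell must be larger than the error diameter to be reliably selectable.

```python
# Illustrative model (assumed, not from the disclosure): the positional
# sensing error radius grows linearly with height above the display, so
# fewer, larger icons fit while keeping selection reliable.

def max_grid(height_cm, screen_w_cm=6.0, screen_h_cm=10.0,
             base_error_cm=0.3, error_per_cm=0.4):
    """Return (cols, rows) of the densest icon grid in which each
    icon cell exceeds twice the expected positional error radius."""
    error = base_error_cm + error_per_cm * height_cm  # error radius at this height
    min_cell = 2.0 * error                            # cell must exceed error diameter
    cols = max(1, int(screen_w_cm // min_cell))
    rows = max(1, int(screen_h_cm // min_cell))
    return cols, rows

for h_cm in (1, 3, 5):
    print(h_cm, max_grid(h_cm))
```

Under these assumed parameters the grid shrinks as the hand moves away from the display, mirroring the qualitative behavior shown in FIGS. 2A and 2B.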
FIGS. 2A and 2B are diagrams for describing an error range according to a distance between a proximity sensor and a hand according to the related art.
Referring to FIG. 2A, for a short distance between the display unit and the user, the region that can be accurately identified and detected is small, so many 3D objects may be displayed. In FIG. 2B, for a long distance between the display unit and the user, the region that can be accurately identified and detected is large, so fewer 3D objects may be displayed than in FIG. 2A.
FIGS. 3A and 3B are diagrams for describing sensing accuracy with respect to a distance between a proximity sensor and a hand.
FIG. 3A shows the number of 3D objects that can be displayed according to a distance Z between the display unit and the user, and FIG. 3B shows sensing accuracy according to the distance Z between the display unit and the user.