Recently, techniques such as Augmented Reality (AR) and the Internet of Things (IoT) have been actively developed to link the real world with virtual information and thereby enable interactions. Interface apparatuses have been developed that identify the positions of a real object and a finger using a range sensor, enabling interaction between the real object and the finger. The interaction between the real object and the finger refers to actions such as touching the object or tracing a finger over it.
A technique is also known for reducing erroneous detections of operations or designated positions based on a user's hand movements in an environment where objects included in an input image are moved by the user's operations (see, for example, Patent Document 1).
For an interaction between an actual object (target object) and a finger, the target object is installed and a calibration process is performed in advance. The calibration process places the target object and registers its position, e.g., the distance between the range sensor and the target object, with the hand removed from the shooting range of the range sensor. The calibration process is also performed whenever the target object has moved.
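As an illustration only (not part of this document), the calibration step described above could be sketched as follows, assuming depth frames arrive as NumPy arrays of per-pixel distances in millimetres. The function names, the region-of-interest convention, and the touch threshold are all hypothetical:

```python
import numpy as np

def calibrate(depth_frame, roi):
    """Register the reference distance between the range sensor and the
    target object.  `depth_frame` is captured with the hand removed from
    the sensor's shooting range; `roi` = (y0, y1, x0, x1) bounds the object."""
    y0, y1, x0, x1 = roi
    # Median is robust against stray noisy pixels in the depth image.
    return float(np.median(depth_frame[y0:y1, x0:x1]))

def is_touching(finger_depth, reference_depth, threshold_mm=10.0):
    """Judge a touch when the fingertip depth is within `threshold_mm`
    of the registered object surface (hypothetical threshold)."""
    return abs(finger_depth - reference_depth) <= threshold_mm

# Example: a flat target surface 800 mm from the sensor.
frame = np.full((480, 640), 800.0)
ref = calibrate(frame, (100, 300, 100, 300))
print(is_touching(795.0, ref))  # fingertip 5 mm from the surface
```

If the target object moves, the registered reference distance no longer matches the scene, which is why the calibration must be redone; that repeated burden is the problem the document goes on to describe.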
Performing the calibration process every time a target object is installed or moved is very burdensome and impairs usability.

Patent Document 1: Japanese Laid-open Patent Publication No. 2015-22624
Patent Document 2: Japanese Laid-open Patent Publication No. 2001-282456
Non-Patent Document 1: Wataru Watanabe and three others, “Development and Improved Operability of Projection Plane Touch UI Based on Projector and Depth Camera”, 21st Image Sensing Symposium, June 2015