Field of the Invention
The present invention relates to a method of measuring the position and orientation of an image capturing apparatus.
Description of the Related Art
Measurement of the position and orientation of an image capturing apparatus based on image information is used for alignment of a virtual object with real space in augmented reality/mixed reality, for self-location estimation of a robot or automobile, and for three-dimensional modeling of an object or scene.
Literature 1 (G. Klein and D. Murray, “Parallel Tracking and Mapping for Small AR Workspaces,” Proc. 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR '07), 2007) discloses a method in which information of feature points in a scene is held as a three-dimensional map, and in which the position and orientation of an image capturing apparatus are estimated on the basis of associations between feature points detected on an image and feature points in the three-dimensional map. In this method, the position and orientation measured in a previous frame are used for measurement of the position and orientation in a current frame. For example, the position and orientation in the current frame are predicted on the basis of the position and orientation measured in the previous frame and a motion model, and the prediction is used for associating feature points on the image with feature points in the three-dimensional map. Furthermore, the predicted position and orientation are used as initial values for iterative calculation for obtaining the position and orientation in the current frame.
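The prediction-and-association step described above can be sketched as follows. This is an illustrative simplification, not the implementation of Literature 1: the pose is reduced to a position and a yaw angle, a constant-velocity motion model extrapolates the previous frame-to-frame motion, and projected map points are associated with detected features inside a fixed pixel search radius. All function names, the pose representation, and the gating radius are assumptions introduced here for illustration.

```python
import math

# Hypothetical sketch of constant-velocity pose prediction and gated
# feature association. Poses are (x, y, z, yaw) tuples, yaw in radians.

def predict_pose(pose_prev, pose_curr):
    """Extrapolate the current pose to the next frame by applying the
    previous frame-to-frame motion once more (constant-velocity model)."""
    x0, y0, z0, a0 = pose_prev
    x1, y1, z1, a1 = pose_curr
    # Wrap the angular difference into (-pi, pi] before extrapolating.
    da = math.atan2(math.sin(a1 - a0), math.cos(a1 - a0))
    yaw = math.atan2(math.sin(a1 + da), math.cos(a1 + da))
    return (2 * x1 - x0, 2 * y1 - y0, 2 * z1 - z0, yaw)

def gate_matches(predicted_uv, detected_uv, radius):
    """Associate each map point, projected with the predicted pose, with
    the nearest detected feature within a pixel search radius."""
    matches = []
    for i, (u, v) in enumerate(predicted_uv):
        best, best_d = None, radius
        for j, (du, dv) in enumerate(detected_uv):
            d = math.hypot(u - du, v - dv)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            matches.append((i, best))
    return matches
```

In a full system, the predicted pose would also seed the iterative (e.g. Gauss-Newton) refinement that minimizes the reprojection error of the associated feature points.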
If the image capturing apparatus moves significantly between the previous and current frames, or if the number of feature points detected on an image is extremely small, measurement of a position and orientation may fail. A process is therefore required for detecting failure of measurement of a position and orientation and performing recovery from the failure. In the above-described Literature 1, the quality of position and orientation measurement is determined on the basis of the proportion of correctly associated feature points. If low-quality measurement continues for a certain number of frames, it is determined that position and orientation measurement has failed.
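The failure test described above can be sketched as a small monitor: measurement quality for each frame is judged by the fraction of correctly associated feature points, and failure is declared only after the quality stays low over several consecutive frames. The class name, the ratio threshold, and the frame count below are illustrative assumptions, not values taken from Literature 1.

```python
# Hypothetical sketch of frame-quality monitoring for tracking-failure
# detection; thresholds are illustrative, not from Literature 1.

class TrackingMonitor:
    def __init__(self, min_inlier_ratio=0.3, max_bad_frames=5):
        self.min_inlier_ratio = min_inlier_ratio  # quality threshold
        self.max_bad_frames = max_bad_frames      # consecutive-frame limit
        self.bad_frames = 0

    def update(self, num_inliers, num_matches):
        """Record one frame's association result; return True once
        low-quality measurement has persisted long enough to be
        declared a tracking failure."""
        ratio = num_inliers / num_matches if num_matches else 0.0
        if ratio < self.min_inlier_ratio:
            self.bad_frames += 1
        else:
            self.bad_frames = 0  # one good frame resets the count
        return self.bad_frames >= self.max_bad_frames
```

Requiring several consecutive low-quality frames avoids declaring failure on a single frame spoiled by, for example, momentary motion blur.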
Since the method of determining failure of position and orientation measurement in the above-described Literature 1 is based on how the scene appears on an image, a wrong determination as to whether an estimated position and orientation are correct may be made if portions similar in appearance exist at different locations within the scene. Furthermore, in the case where the number of feature points is extremely small, measurement may wrongly be determined to have failed, because the proportion of correctly associated feature points is small even when the estimated position and orientation are correct.