1. Field of the Invention
The present invention relates to an image processing device for processing an image obtained by image-acquiring an object.
2. Description of the Related Art
Generally, there is distortion in an image optically obtained by an image acquisition apparatus such as a camera. Various technologies have therefore conventionally been proposed as methods of removing this distortion. However, even when such a technology is applied, larger errors remain at the periphery of the screen.
An image is discrete information, and even a one-pixel shift of a point located far away in the image translates into a large error in real space.
Accordingly, the position and size of features in the image are very important for calculating the position and orientation of the camera.
Conventionally, many methods of obtaining position and orientation information of the camera have been proposed.
For example, U.S. Pat. No. 6,577,249 B1 discloses position and orientation calculation which uses markers. This is a technology that enables checking of the positions of a reference marker, markers arranged on its periphery, and/or feature portions by reading information contained in the reference marker, to roughly obtain a position and an orientation of a camera. According to the technology disclosed in this USP, by checking the positions of four or more markers and/or feature portions, it is possible to estimate the position and the orientation of the camera by using this information. Additionally, this USP discloses a related information presentation device for superimposing and displaying pieces of related information of portions of an object at corresponding positions on an acquired image. When the pieces of related information are superimposed and displayed, the display position of the pieces of related information to be presented is decided based on the position and the orientation of the camera thus estimated, i.e., in accordance with the visual field of the camera.
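The USP itself does not disclose a specific algorithm for this estimation, but one standard way to recover a camera pose from four or more coplanar marker points is a planar homography. The sketch below, a minimal illustration assuming normalized image coordinates (intrinsic parameters already removed) and marker coordinates given in the marker plane (Z = 0), estimates the homography by a direct linear transform and decomposes it into a rotation R and translation t; the function name and inputs are hypothetical.

```python
import numpy as np

def pose_from_four_markers(marker_xy, image_uv):
    """Estimate camera rotation R and translation t from >= 4 coplanar
    marker points (hypothetical marker-plane coordinates, Z = 0) and
    their normalized image projections, via a planar homography."""
    A = []
    for (X, Y), (u, v) in zip(marker_xy, image_uv):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    # Homography = null vector of the stacked constraint matrix
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    if H[2, 2] < 0:            # keep the markers in front of the camera
        H = -H
    s = np.linalg.norm(H[:, 0])  # homography scale factor
    r1, r2, t = H[:, 0] / s, H[:, 1] / s, H[:, 2] / s
    r3 = np.cross(r1, r2)        # third rotation column from orthogonality
    R = np.column_stack([r1, r2, r3])  # orthonormal for noise-free input
    return R, t
```

With noisy measurements, the recovered columns r1 and r2 are only approximately orthonormal, and the rotation is usually re-orthogonalized (e.g. via SVD) before use.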
“An Augmented Reality System and its Calibration based on Marker Tracking”, pp. 607-616, Journal of The Virtual Reality Society of Japan (TVRSJ), discloses position and orientation calculation which uses a marker alone.
US 2002/0191862 A1 discloses camera position and orientation calculation based on natural features (NF). A shape M whose position and orientation are known is used as shown in FIG. 1A; the position and orientation of the camera are calculated by the procedure shown in FIG. 1B, and further the three-dimensional position of each feature point is calculated.
First, the following processing is executed in step S101. That is, by recognizing the known shape M at an initial position PO0, the position and orientation of the camera 100 are calculated. Subsequently, the camera 100 is continuously moved from the initial position PO0 while the known shape M is kept captured on the screen (visual field). When the camera is moved to a position PO1, the position and orientation of the camera 100 at the position PO1 are similarly calculated by recognizing the known shape M.
Next, the following processing is executed in step S102. That is, based on the calculated positions and orientations of the camera 100 at the positions PO0 and PO1, a base line length BL1 between the positions PO0 and PO1 can be estimated. Accordingly, by the principle of triangulation, the three-dimensional position of each feature point indicated by a black circle in FIG. 1A and continuously captured on the screen is calculated.
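The triangulation of step S102 can be sketched as a standard linear (DLT) two-view triangulation. This is an illustrative implementation, not the one in US 2002/0191862 A1; it assumes the two camera poses are expressed as 3x4 projection matrices and the feature is observed in normalized image coordinates.

```python
import numpy as np

def triangulate(P0, P1, x0, x1):
    """Linear (DLT) triangulation of one feature point seen in two views.

    P0, P1 : 3x4 camera projection matrices (e.g. at positions PO0, PO1)
    x0, x1 : (u, v) image coordinates of the same feature in each view
    Returns the 3-D point in world coordinates."""
    # Each observation gives two linear constraints on the homogeneous point
    A = np.stack([
        x0[0] * P0[2] - P0[0],
        x0[1] * P0[2] - P0[1],
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
    ])
    # Solution = right singular vector for the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # dehomogenize
```

The accuracy of the result depends directly on the base line length: the longer the base line BL1, the better conditioned the intersection of the two viewing rays.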
Subsequently, the following processing is executed in step S103. That is, the camera 100 is continuously moved from the position PO1. When it is moved to a position PO2, the known shape M can no longer be captured on the screen. However, even at the position PO2 where the known shape M is not captured on the screen, the position and orientation of the camera 100 can be calculated from the previously calculated position of each black-circle feature point.
Accordingly, the following processing is executed in step S104. That is, a base line length BL2 between the positions PO1 and PO2 can be calculated. Thus, the three-dimensional position of even a feature point not captured on the screen at the position PO0, indicated by a white circle in FIG. 1A, can be calculated.
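The pose calculation of step S103, recovering the camera from feature points whose three-dimensional positions are already known, can be sketched as a DLT estimate of the 3x4 projection matrix followed by extraction of the camera center as its null space. This is an illustrative formulation under assumed normalized image coordinates, not the specific method of US 2002/0191862 A1.

```python
import numpy as np

def camera_center_from_points(X_world, x_img):
    """Estimate the camera position from >= 6 known 3-D feature points
    and their 2-D image projections (normalized coordinates), by solving
    for the 3x4 projection matrix P with a DLT, then taking the camera
    center as the null space of P."""
    rows = []
    for (X, Y, Z), (u, v) in zip(X_world, x_img):
        Xh = [X, Y, Z, 1.0]
        rows.append([*Xh, 0, 0, 0, 0, *(-u * c for c in Xh)])
        rows.append([0, 0, 0, 0, *Xh, *(-v * c for c in Xh)])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    P = Vt[-1].reshape(3, 4)
    # The camera center C satisfies P @ [C; 1] = 0
    _, _, Vt = np.linalg.svd(P)
    C = Vt[-1]
    return C[:3] / C[3]
```

The camera orientation can likewise be recovered by decomposing the left 3x3 block of P; as noted above, feature points far from the image center carry larger real-space errors, so the point distribution strongly affects the stability of this estimate.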
US 2002/0191862 A1 discloses an auto-calibration (AC) method as a conventional technology.
“Extendible Tracking By Line Auto-calibration”, ISMAR 2001, pp. 97-103, discloses camera position and orientation calculation based on tracking of natural feature segments and AC.