Camera tracking is a technique for analyzing images captured by a camera to estimate the camera's state variables at the time of capture. Depending on the application field, the state variables of a camera may include only motion variables, e.g., translation and rotation of the camera, or may further include internal variables, e.g., a focal length, a principal point, and an aspect ratio. The estimated state variables are loaded into software, e.g., Maya or 3ds Max, and a user can then render a CG (Computer Graphics) object by moving a virtual camera according to the state variables, so as to composite the CG object with the actual captured background. Such a method is widely used in producing movies and commercial films.
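As a sketch of the internal variables mentioned above, the focal length, principal point, and aspect ratio can be assembled into the standard 3x3 intrinsic matrix commonly used to project a 3D point in camera coordinates onto the image plane. The parameter values and function names below are hypothetical and chosen only for illustration.

```python
# Assemble the camera's internal variables (focal length, principal
# point, aspect ratio) into a 3x3 intrinsic matrix K. All numeric
# values here are illustrative assumptions, not from the source.

def intrinsic_matrix(focal_length, principal_point, aspect_ratio):
    fx = focal_length
    fy = focal_length * aspect_ratio  # vertical focal length via aspect ratio
    cx, cy = principal_point
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]

def project(K, point_cam):
    """Project a 3D point in camera coordinates to pixel coordinates."""
    x, y, z = point_cam
    u = (K[0][0] * x + K[0][2] * z) / z
    v = (K[1][1] * y + K[1][2] * z) / z
    return u, v

K = intrinsic_matrix(800.0, (320.0, 240.0), 1.0)
print(project(K, (0.1, -0.2, 2.0)))  # a point 2 m in front of the camera
```

Estimating these internal variables alongside the motion variables is what lets the virtual camera in the CG software reproduce the real camera's view.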
Real-time camera tracking, i.e., camera tracking performed simultaneously with imaging, is widely used in, e.g., unmanned robot driving, augmented reality applications, and real-time compositing preview. In real-time camera tracking, the number of features in the state vector increases as the area within which the camera moves widens. Accordingly, the sizes of the feature map and the state vector grow over time.
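The growth described above can be sketched for a filter-based tracker (e.g., an EKF-style approach such as MonoSLAM), where the state vector holds the camera pose plus every mapped feature and the filter maintains a covariance matrix over the full state, so memory grows roughly quadratically with the number of features. The dimensions below (a 13-dimensional camera state, 3-dimensional point features) are illustrative assumptions, not values from the source.

```python
# Illustrative sketch: state vector and covariance growth in a
# filter-based real-time tracker. Dimensions are assumptions.

CAMERA_DIM = 13   # e.g., position, orientation quaternion, velocities
FEATURE_DIM = 3   # a 3D point per mapped feature

def state_size(n_features):
    # State vector = camera state plus all mapped features.
    return CAMERA_DIM + FEATURE_DIM * n_features

def covariance_entries(n_features):
    # Full covariance is a square matrix over the whole state,
    # so storage grows quadratically in the feature count.
    n = state_size(n_features)
    return n * n

for n in (10, 100, 1000):
    print(n, state_size(n), covariance_entries(n))
```

With 1000 mapped features the covariance alone already has over nine million entries, which illustrates why a widening camera work area strains both memory and per-frame computation.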
Research has been carried out to improve the stability and accuracy of real-time camera tracking. However, conventional real-time camera tracking still has a drawback in that a wide camera moving area results in a lack of memory and a decrease in computation speed.