It is known that there are methods and models for tracking a three-dimensional object in an environment and computing its position and orientation with respect to a predetermined coordinate system. Such tracker systems are used, for example, in aircraft to determine the orientation of the pilot's head. Once the orientation is acquired with respect to the coordinate system of, say, the display devices, graphics can be generated on them accordingly. There are different methods to track an object in the scene using magnetic, mechanical or optical means. Currently, the spatial relations of objects may also be determined using magnetic sensors or laser beams, but this invention relates specifically to systems using camera-based (day-TV, thermal, IR, time-of-flight etc.) trackers.
In one of the optical camera-based systems, the pilot wears a helmet with marks or patterns on it, and at least one tracker camera determines the helmet's position and orientation using coordinate transformation calculations based on these patterns. Computing the spatial relation between an object carrying a tracking pattern and a camera is therefore well known in the state of the art. Throughout this document, whenever a spatial relation is mentioned, the relation between one entity's predetermined reference system and the other's is meant. This reference system is generally based on the respective pattern of the object under consideration. Since the position of the tracker camera with respect to the other systems is known (or can be calculated or measured), it is also possible to compute the helmet's spatial relation with the tracker camera's sensor and hence with the other systems. Computing the spatial relation between different entities, given their spatial relations with respect to a known reference frame, is also possible with the related vector translations and matrix calculations. In the same manner, similar concepts can be applied to robotic systems to determine the spatial relations of the various arms of a robotic product. In short, a tracker system using a tracker camera tracks an object (the tracked object) and calculates its position and orientation with respect to a known reference frame; this relative relation of the tracked object is then used for many different purposes. In this context, “tracked object” means an object having a tracking pattern and being tracked by a tracker system. It may be a helmet, as in a helmet-mounted tracker system, or any other object.
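The composition of spatial relations described above can be sketched with homogeneous transformation matrices. The following is a minimal illustration, not part of the invention; the frame names (helmet, camera, cockpit) and the numeric poses are hypothetical, chosen only to show how a helmet-to-camera relation and a camera-to-reference relation combine by matrix multiplication into a helmet-to-reference relation.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical poses: the tracker camera measures the helmet pattern's pose
# (helmet -> camera), and the camera's own pose in a cockpit reference frame
# (camera -> cockpit) is known from measurement or calibration.

# Illustrative 90-degree rotation about the z-axis
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])

helmet_to_camera = make_transform(Rz, [0.0, 0.0, 1.0])
camera_to_cockpit = make_transform(np.eye(3), [2.0, 0.0, 0.0])

# Chained matrix multiplication composes the two spatial relations
helmet_to_cockpit = camera_to_cockpit @ helmet_to_camera

# The helmet frame's origin expressed in the cockpit frame
origin = np.array([0.0, 0.0, 0.0, 1.0])
print(helmet_to_cockpit @ origin)  # -> [2. 0. 1. 1.]
```

The same chaining extends to any number of frames (e.g. helmet to camera to display), which is why knowing each pairwise relation with respect to a common reference is sufficient.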
The patterns used in camera-based tracker systems are either graphical (generally black and white) patterns (passive markers) tracked by visible-light cameras, or arrays of infrared LEDs (active markers) tracked by infrared cameras. Other arrangements are also possible, but the most convenient among them is the one with infrared LEDs, since these systems can work under unfavourable lighting conditions. There are also problems related to these tracker systems, such as calibration and accuracy determination. Calibration and testing of such systems are generally cumbersome and difficult, and require complicated equipment. Furthermore, it is sometimes essential to determine the accuracy of the system's current state instead of calibrating the tracker system.
In a currently used calibration method, a laser tracker is used to determine the orientation of different objects relative to a reference point in the scene, and the system under consideration is adjusted so that it is consistent with these measurements. Although this method results in an accurate calibration, it uses a laser tracker, which is an expensive solution when it is only required to determine the current accuracy of the system under consideration.
Another method currently used to calibrate a head tracker system employs specially adapted mechanisms to change the positions of the markers in the scene and then configures the tracker system using these data, generating a coordinate system. Again, this method is not practical and uses special equipment, which makes the process complicated. When it is only necessary to determine the current accuracy of the system, the error of the system in its current state must be deduced.
The current methods do not offer a simple and efficient way of merely measuring a tracker system's accuracy while it is tracking an object. To solve this problem, a new methodology should be introduced which uses simpler tools and steps.
The Chinese patent document CN101524842, an application in the state of the art, discloses a method for calibrating a robot that takes an initial test point as a reference, establishes a test coordinate system with a laser tracker, tracks a curved arc with the laser tracker, and records position coordinates measured in the clockwise and counter-clockwise directions.
The Japanese patent document JP2009085806, an application in the state of the art, discloses a method for accurately setting a camera coordinate system and calibrating the head tracker system by moving the optical marker with a stage mechanism and taking measurements.