The current state of the art for formation-flying relative positioning and rotational control to an inertial reference consists of the following: (1) radio ranging to ground stations for orbit control, with various on-board sensors to determine the inertial reference for reconstruction on the ground; (2) laser ranging, with various on-board sensors to determine the inertial reference for reconstruction on the ground; (3) GPS and radio ranging, plus various on-board sensors to determine the inertial reference for reconstruction on the ground; and (4) various optical (visible-to-IR wavelength) sensors to determine relative knowledge for rendezvous and docking operations, with various on-board sensors to determine the inertial reference.
The aforementioned techniques, however, do not provide relative state knowledge and the inertial reference simultaneously. Reconstructing this combined information requires multiple sensors whose outputs must be merged to control the spacecraft formation. The reconstruction must also account for errors caused by misalignments between the mounting locations of the multiple sensors, as well as other on-orbit sensor behaviors that degrade the science measurements and/or imaging quality.
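To illustrate why mounting misalignments between separate sensors matter, the following sketch (hypothetical, not part of the disclosed invention; the function names, the 1 mrad misalignment, and the 100 m baseline are illustrative assumptions) maps a relative-position vector measured in one sensor's frame into the inertial frame defined by another sensor's attitude estimate, and shows the position error incurred if the sensor-to-body mounting rotation is neglected.

```python
# Hypothetical sketch: combining a relative-position measurement from one
# sensor with an inertial attitude from another, correcting for a small
# mounting misalignment between the two sensor frames.
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def relative_state_in_inertial(rel_vec_sensor, attitude_body_to_inertial,
                               misalignment_sensor_to_body):
    """Map a relative-position vector from the sensor frame to the
    inertial frame: apply the sensor-to-body mounting correction first,
    then the body-to-inertial attitude rotation."""
    return attitude_body_to_inertial @ (misalignment_sensor_to_body
                                        @ rel_vec_sensor)

# Assumed example values: a 1 mrad yaw mounting error on a 100 m baseline,
# with an identity attitude for clarity.
baseline = np.array([100.0, 0.0, 0.0])  # metres, in the sensor frame
misalign = rot_z(1e-3)                  # assumed 1 mrad mounting error
attitude = np.eye(3)                    # body-to-inertial attitude

corrected = relative_state_in_inertial(baseline, attitude, misalign)
uncorrected = attitude @ baseline       # misalignment ignored
error = np.linalg.norm(corrected - uncorrected)
print(round(error, 3))  # about 0.1 m of relative-position error
```

The small-angle result (error ≈ baseline × misalignment angle) is why a single sensor that measures both the relative and inertial states in one frame, as motivated below, removes an entire class of inter-sensor calibration error.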
Currently, no single sensor combines both relative and inertial spacecraft knowledge in the same image frame. Thus, a sensor, including an algorithm, that solves the above-mentioned problems may be beneficial.