Augmented reality (AR) systems are often implemented with specifically selected hardware and/or software components tested to ensure compatibility and performance. For example, an AR system may have proprietary color camera sensors and depth sensors engineered to output pairs of image frames (e.g., one color frame and one depth frame captured at equivalent viewpoints) at the same time. AR functionality may also be included as a feature in general-purpose or multifunction mobile devices, such as smart phones. However, these mobile devices typically are unable to synchronize output image frames from a color camera sensor with those from a depth sensor, because the operating environment of current mobile devices does not support time synchronization or time stamps with accurate creation time identifiers across different device cameras. Output from current mobile device cameras is typically missing any reliable time stamp indicating time of creation.
Having unsynchronized cameras within a system can compromise the accuracy of AR output, because the system may process image pairs that are not the best matches. Cameras may be unsynchronized when a system is unable to determine when a camera sensor captured a particular image. Systems within a mobile device, for example, may obtain camera output with varying amounts of delay from the time of creation. For example, sensor image frames created at time T1 may not be available for processing by the AR components of the mobile device until the frames are received at a later time T2. The color and depth sensors may each have a different time delay between creation of an image frame and output, and the frequency of output may also differ for each sensor. Therefore, new and improved techniques for processing color and depth images are desired.
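As one illustration of the pairing problem described above, the following sketch matches each color frame to the depth frame with the nearest arrival time. This is only a minimal, hypothetical strategy (the function name `pair_frames` and the sample arrival times are assumptions, not part of the original description); when true creation times are unavailable and the two sensors have different delays and frame rates, nearest-arrival matching can still pair a frame with a poorly matching counterpart, which is exactly the accuracy risk noted above.

```python
from bisect import bisect_left

def pair_frames(color_times, depth_times):
    """Pair each color frame with the depth frame whose arrival time
    is closest. Arrival times stand in for the unavailable creation
    times, so pairs may not reflect equivalent capture moments."""
    pairs = []
    for ct in color_times:
        i = bisect_left(depth_times, ct)
        # Consider the depth frames on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(depth_times)]
        best = min(candidates, key=lambda j: abs(depth_times[j] - ct))
        pairs.append((ct, depth_times[best]))
    return pairs

# Hypothetical arrival times (seconds): color at ~30 Hz, depth at a
# lower rate and with a different delay from creation to output.
color = [0.000, 0.033, 0.066, 0.100]
depth = [0.010, 0.076]
print(pair_frames(color, depth))
# → [(0.0, 0.01), (0.033, 0.01), (0.066, 0.076), (0.1, 0.076)]
```

Note that two different color frames can be paired with the same depth frame when the depth sensor outputs at a lower frequency, illustrating why mismatched output rates and delays degrade AR accuracy.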