Generally speaking, optical 3D dimensioning based on structured-light triangulation imaging (parallax) suffers accuracy loss introduced by variations in the relative positions and orientations of the projector, the camera, and the projector-camera pair. These variations can result from thermal, structural, or other changes, such as component aging. The dimensioning accuracy problem can be partially solved with calibration, but the ultimate accuracy is still limited by the non-calibratable portion of the variations, such as shock and vibration. In addition, temperature changes of the system, whether due to ambient temperature change or self-generated heat, may affect the triangulation geometry. Temperature gradient changes arising from nonuniform heat sources and nonuniform heat dissipation may introduce complex deformations to the triangulation geometry and to individual components, and are hard to resolve by calibration. More specifically, changes in camera focusing and distortion may directly contribute to the 3D dimensioning error, and such changes are difficult to control or correct with calibration. Components of a camera module are usually made from multiple materials with significantly different coefficients of thermal expansion (CTEs). For example, the materials may include a silicon sensor at about 3.5 ppm/°C, a glass lens at about 9 ppm/°C, an aluminum barrel and holder at about 22 ppm/°C, and plastic parts at more than 60 ppm/°C. Such a combination makes it virtually impossible to fully compensate for the changes in pattern image positions on the image sensor introduced by thermal expansion.
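As a back-of-envelope illustration of the CTE mismatch described above, the following sketch estimates the lateral shift of a pattern point imaged off-axis when the module warms up. The feature offset (2 mm), temperature change (20 °C), and the assumption of uniform expansion about a fixed center are illustrative assumptions, not measured values from any particular module.

```python
# Back-of-envelope estimate of thermally induced image shift in a camera
# module built from materials with different CTEs. All numeric values are
# illustrative assumptions, not measured data.

CTE_PPM_PER_C = {            # coefficients of thermal expansion (ppm/degC)
    "silicon_sensor":  3.5,
    "glass_lens":      9.0,
    "aluminum_barrel": 22.0,
    "plastic_part":    60.0,
}

def thermal_shift_um(feature_offset_mm: float,
                     cte_ppm: float,
                     delta_t_c: float) -> float:
    """Lateral shift (micrometers) of a feature located feature_offset_mm
    from the expansion center, for a temperature change delta_t_c,
    assuming simple linear expansion."""
    return feature_offset_mm * 1e3 * cte_ppm * 1e-6 * delta_t_c

# A pattern point imaged 2 mm off-axis, after a 20 degC warm-up:
for material, cte in CTE_PPM_PER_C.items():
    print(f"{material:16s}: {thermal_shift_um(2.0, cte, 20.0):5.2f} um")
```

Even under this simplified model, the aluminum and plastic parts shift the feature by roughly 0.9 µm and 2.4 µm respectively; against a typical pixel pitch of a few micrometers, each material contributes a different sub-pixel displacement, which is why a single calibration cannot compensate for all of them at once.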
To overcome this issue, instead of the standard projector-camera-pair triangulation, a dual-pattern optical 3D dimensioning system utilizing two or more identical projected patterns may be applied to generate 3D depth data from the dual-pattern image captured by the camera.
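One plausible reading of the dual-pattern geometry can be sketched with a pinhole-camera model: if the two projected patterns are truly parallel, neighboring points of the dual pattern keep a fixed physical separation d at every depth, so their image-plane spacing s shrinks with distance as s = f·d/Z. The depth then depends only on the camera focal length and the pattern separation, not on the projector-camera baseline. The function and variable names below are illustrative assumptions, not taken from the system described here.

```python
# Simplified pinhole-camera sketch of depth recovery from the image-plane
# spacing of two parallel projected patterns. Geometry and names are
# illustrative assumptions.

def depth_from_spacing(focal_len_mm: float,
                       pattern_separation_mm: float,
                       image_spacing_mm: float) -> float:
    """Under a pinhole model, two parallel pattern points separated by a
    fixed physical distance d image onto the sensor with spacing
    s = f * d / Z, so the depth is Z = f * d / s."""
    return focal_len_mm * pattern_separation_mm / image_spacing_mm

# Example: f = 8 mm lens, patterns 10 mm apart, measured sensor spacing
# of 0.16 mm between neighboring points of the dual pattern:
z_mm = depth_from_spacing(8.0, 10.0, 0.16)
print(f"estimated depth: {z_mm:.0f} mm")  # 8 * 10 / 0.16 = 500 mm
```

Because the ratio f·d/s involves no projector-camera baseline, this simplified model suggests why such a scheme would be less sensitive to drift in the relative projector-camera geometry than standard triangulation.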
Several attempts have been made to address this issue. For example, PCT Pat. App. Pub. No. WO 2014/011182 to Gharib describes convergence/divergence-based depth determination techniques and their use with defocusing imaging. The system includes two projectors emitting converging red and blue light patterns, respectively (or, alternatively, a single split beam), and a camera to capture the patterns. Chinese Pat. App. Pub. No. CN 104050656 describes an apparatus and techniques for determining object depth in images. The system includes an emitter to project a low-resolution optical pattern and a high-resolution optical pattern, and a sensor to detect a composite image, which is then processed to obtain depth information for the object over a range of depths. However, neither of these references mentions projecting parallel light patterns, or determining the distance between neighboring points of a dual pattern for calculating the depth of the object. The paper "Development of Real Time 3-D Measurement System Using Intensity Ratio Method" by Miyasaka et al. describes a system for calculating the depth of an object using an intensity ratio method. The system includes a light source and a video camera, wherein two types of light patterns, a flat pattern and a linear pattern, are projected alternately onto the target object. Although the reference mentions calculating the ratio of the intensities of the two light patterns at a given pixel, it does not mention calculating a ratio based on the distance between neighboring points to determine the depth of the object. Furthermore, the reference does not mention using two light sources projecting dual parallel light patterns onto the target object. U.S. Pat. App. Pub. No. 2016/0288330 to Konolige describes a system and method for depth sensing.
The system includes a light source, a computing device, and two optical sensors separated by a fixed distance, each having a first set of multiple photodetectors to capture visible light and a second set of multiple photodetectors to capture infrared light. The depth information obtained from the visible and infrared light images is combined to determine a depth map of surfaces or objects. However, the reference does not mention using two projectors to project parallel light patterns simultaneously onto the target, nor does it mention determining the distance between the neighboring light patterns and using it for calculating the depth of the target.
Therefore, a need exists for a system and method for accurate optical 3D dimensioning.