Many electronic imaging devices, such as smartphones or tablets with one or more cameras, actively illuminate the scene to facilitate extraction of non-traditional information from the scene. One example is measurement of the depth of objects in a scene, or of the dimensions of an object in the scene, during image capture. One such system on these devices is an active stereoscopic system that can be used to generate images of objects in 3D space, where each pixel has a depth value in addition to the Red-Green-Blue (RGB) values typical of a 2D system. Active stereo systems illuminate the scene to provide detectable features for depth measurement, which improves the camera's performance on objects having few naturally visible features for the stereo system to match and triangulate. Other examples of systems incorporating active illumination include structured light, which illuminates the scene with a specific pattern and uses that pattern to triangulate the individually recognized projected features, and coded light, which projects a time-varying pattern and analyzes distortions in the pattern to infer depth. Non-depth applications include iris scanning and face recognition, both of which use invisible illumination to reduce the effects of naturally occurring shadows.
In one specific example of active stereo depth cameras, two or more cameras may be used to measure the depth of an object in a scene by triangulation. This involves measuring the angles to a common object in the scene from two or more separate cameras or sensors. When the orientation of, and the distance between, the two or more cameras are known, the depth of the common object can be calculated.
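For a calibrated and rectified stereo pair, the triangulation described above reduces to a simple relation between the cameras' focal length, their separation (the baseline), and the disparity of the matched feature. The following is a minimal sketch of that relation; the function name and the numeric values are illustrative, not from the source.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point seen by two rectified stereo cameras.

    For a rectified pair, triangulation reduces to Z = f * B / d, where
    f is the focal length in pixels, B is the baseline (distance between
    the cameras) in metres, and d is the horizontal disparity in pixels
    of the same feature between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Illustrative example: f = 700 px, baseline = 5 cm, disparity = 14 px
# gives Z = 700 * 0.05 / 14 = 2.5 m
depth = depth_from_disparity(700.0, 0.05, 14.0)
```

Note that a smaller disparity corresponds to a more distant object, which is why depth resolution degrades with range.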
The active stereo example addresses the case in which a scene contains too few object points to perform triangulation, such as an image of a plain white wall. To accommodate these scenes, the active stereo camera includes a projector, such as an infra-red laser projector, to project an intensity pattern onto the scene. The stereo cameras, or IR sensors, may then use the intensity pattern to measure the depth of the object onto which the pattern is projected. Difficulties often occur, however, when the scene is outdoors in sunlight. The sunlight can saturate the light entering the imaging device, resulting in a low signal-to-noise ratio (SNR), and may even entirely wash out the projected intensity pattern.
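To see why sunlight degrades the SNR even before the sensor saturates, consider a simplified shot-noise-limited model: the noise grows with the total light collected, while the projected pattern contributes a fixed signal. The sketch below uses this standard model with illustrative photoelectron counts; the function name, the specific numbers, and the read-noise figure are assumptions for the example, not values from the source.

```python
import math

def pattern_snr(signal_e: float, ambient_e: float, read_noise_e: float = 5.0) -> float:
    """Approximate shot-noise-limited SNR of a projected pattern.

    signal_e:  photoelectrons per pixel contributed by the projector pattern
    ambient_e: photoelectrons per pixel from ambient light (e.g. sunlight)
    Shot noise scales with the square root of all collected light, so
    SNR = S / sqrt(S + A + r^2), where r is the sensor read noise.
    A fixed-power pattern is therefore drowned out by strong ambient light.
    """
    return signal_e / math.sqrt(signal_e + ambient_e + read_noise_e ** 2)

# Same projected pattern (200 e- per pixel) indoors versus in sunlight:
indoor_snr = pattern_snr(200.0, 50.0)       # modest ambient light
outdoor_snr = pattern_snr(200.0, 20000.0)   # bright sunlight
```

With these illustrative numbers, the outdoor SNR is roughly an order of magnitude lower than the indoor SNR, even though the projector output is unchanged, which is consistent with the pattern being washed out in sunlight.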
In other examples, such as structured light or coded light, any external source of illumination at the wavelength being used can interfere with the operation of the system, including other systems employing similar technology. The sensor often cannot discriminate between light projected by the system's own projector and light projected by a second camera or another system using the same type of illumination. Therefore, light from a second system, or from a second camera on the same system, can distort the perceived scene and corrupt the depth measurement, face recognition, iris recognition, or any other application in which the system requires control of the scene illumination for optimum performance.