Determining pixel correspondence and disparity between images is a fundamental component of a variety of imaging technologies. For example, in camera array implementations, 3D information may be extracted from multiple 2D images of a scene obtained from different vantage points. Determining disparity between 2D images may include, for a given pixel in one image (e.g., a reference image) from a first vantage point, finding its location in another image (e.g., a target image) from a second vantage point. The shift in position of the pixel between the images is the disparity. Such a disparity may be found for all pixels to generate a disparity map, which may in turn be used to generate a depth map of the scene.
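For a rectified stereo pair, the disparity-to-depth relationship follows the standard pinhole model, depth = focal length × baseline / disparity. The sketch below illustrates this conversion; the parameter names (`focal_px`, `baseline_m`) and the handling of zero-disparity pixels are illustrative assumptions, not part of the description above.

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m, eps=1e-6):
    """Convert a disparity map to a depth map via the pinhole-stereo
    relation depth = focal_px * baseline_m / disparity.
    Pixels with (near-)zero disparity correspond to points at infinity
    and are mapped to np.inf."""
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.full(disparity.shape, np.inf)
    valid = disparity > eps          # avoid division by zero
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```

For example, with a 700-pixel focal length and a 0.1 m baseline, a disparity of 2 pixels corresponds to a depth of 35 m.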
Current disparity search techniques used to determine the correct shift per pixel may search a range of possible shift values (e.g., d = [0, . . . , maxD], where d is the disparity and maxD is a maximum disparity) to determine a best match. For example, a disparity search may compare a window around the given pixel in the reference image against search windows centered on each candidate pixel in the target image within the range of possible shift values. The candidate whose search window best matches the reference window is the best match pixel, and the shift between the two pixels is the disparity value for the given pixel.
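The window-based search described above can be sketched as brute-force block matching with a sum-of-absolute-differences (SAD) cost. This is a minimal illustration under assumed conventions (the target image is the reference shifted left by the disparity, edge-replicated padding, SAD as the match score); the function name and parameters are hypothetical, and production implementations typically vectorize or use more robust costs.

```python
import numpy as np

def block_match_disparity(ref, tgt, max_d=16, win=3):
    """For each reference pixel, compare a (2*win+1)^2 window against
    target windows shifted by d = 0..max_d and keep the shift with the
    lowest SAD cost. Returns an integer disparity map."""
    h, w = ref.shape
    disp = np.zeros((h, w), dtype=np.int32)
    pad = win
    ref_p = np.pad(ref.astype(np.float32), pad, mode="edge")
    tgt_p = np.pad(tgt.astype(np.float32), pad, mode="edge")
    for y in range(h):
        for x in range(w):
            ry, rx = y + pad, x + pad
            ref_win = ref_p[ry - win:ry + win + 1, rx - win:rx + win + 1]
            best_d, best_cost = 0, np.inf
            # Only shifts that stay inside the target image are tried.
            for d in range(min(max_d, x) + 1):
                tx = rx - d
                tgt_win = tgt_p[ry - win:ry + win + 1, tx - win:tx + win + 1]
                cost = np.abs(ref_win - tgt_win).sum()  # SAD match cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

The nested loops make the per-pixel cost O(maxD · win²), which motivates the search for lower-complexity alternatives noted below.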
It may be advantageous to provide image disparity searching with low computational complexity, low memory usage, and high accuracy. It is with respect to these and other considerations that the present improvements are needed.