Typically, to capture a color image using a single charge coupled device (“CCD”) or CMOS sensor, the sensor records the image as an array of pixels representing the color image. Each of the pixels has a digital pixel value (i.e., a number associated with the light intensity received by that respective pixel, e.g., a luma value for each pixel). A color filter array (“CFA”) is typically placed over the sensor so that any particular pixel measures light intensity from only a single color component.
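The CFA sampling described above can be illustrated with a short sketch. The code below is not from the source; it assumes a common RGGB Bayer pattern for the CFA and shows how a full-color image reduces to a single-channel raw mosaic in which each pixel retains only one color component.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an (H, W, 3) RGB image through an assumed RGGB Bayer CFA,
    returning an (H, W) single-channel raw mosaic: each pixel keeps only
    the color component its filter passes."""
    h, w, _ = rgb.shape
    raw = np.zeros((h, w), dtype=rgb.dtype)
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red at even rows, even cols
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green at even rows, odd cols
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green at odd rows, even cols
    raw[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue at odd rows, odd cols
    return raw

rgb = np.random.rand(4, 4, 3)
raw = bayer_mosaic(rgb)  # one luma-like value per pixel, one color each
```

Reconstructing a full-color image from such a mosaic requires demosaicing, which is why the stored per-pixel values must later be processed back into a displayable image.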
FIG. 1 illustrates an image capture device used for capturing an image. The image capture device 10 (e.g., a mobile phone, a digital camera, a video camera, a computing tablet, a laptop, or other device) can capture luma values for an image 12. The luma values are stored on the image capture device for reproduction and viewing, by processing the luma values back into a displayable image.
However, a camera module of the image capture device 10 can suffer from a phenomenon called lens shading. Lens shading is caused by optical properties of the lens of the camera module's image sensor. The effect of lens shading can be more pronounced in some areas of the lens than in others, depending on the angle of the lens at those areas. Furthermore, different colors (e.g., green, blue, or red) can have different responses to the curvature of the lens, which results in distortion in the captured image.
FIG. 2 illustrates various areas of a captured image that are affected by a lens shading effect of an image capture device. If a lens 20 is used to capture an image, the image can be partitioned into several areas 22-26. In each of the areas 22-26, the luma values captured by the image capture device can vary for a uniform light incident on the image capture device due to the curvature of the lens and other minor variations of the image capture device.
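The area-dependent variation described above can be modeled, for illustration only, as a radial falloff in sensitivity: uniform incident light produces lower recorded luma values toward the corners of the sensor than at its center. The falloff profile below is an assumption for the sketch, not a model taken from the source.

```python
import numpy as np

def shading_map(h, w, falloff=0.5):
    """Illustrative relative-sensitivity map for an h x w sensor:
    1.0 at the optical center, (1 - falloff) at the corners, with a
    simple quadratic radial profile (assumed for this sketch)."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    r = np.hypot(ys - cy, xs - cx)          # distance from center
    r_max = np.hypot(cy, cx)                # corner distance
    return 1.0 - falloff * (r / r_max) ** 2

shade = shading_map(480, 640)
uniform = np.full((480, 640), 100.0)        # uniform incident light
captured = uniform * shade                  # corners record lower luma
```

Under this model, the several areas 22-26 of FIG. 2 simply correspond to annular regions of the map with progressively lower recorded luma.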
Typically, lens shading is corrected by using a uniform lens shading correction mesh, where each point of the mesh has a single luma gain to be used for lens shading correction. FIG. 3 illustrates a lens-shading correction mesh of the prior art. A lens shading correction mesh 30 can have several lens coordinates arranged in a mesh. The lens coordinates are equidistant from each other and do not account for the curvature of the respective lens of the image capture device. Furthermore, each point of the mesh has only a single luma gain, regardless of the luminosity of the image to be captured. The single luma gain for each of the points does not account for nonlinearity of the lens shading phenomenon.
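A prior-art-style correction of this kind can be sketched as follows. The code is illustrative and assumes the details not stated in the source: a coarse mesh of equally spaced points, each holding one gain, is bilinearly interpolated up to the full sensor resolution and multiplied into the captured luma values.

```python
import numpy as np

def apply_mesh_gains(luma, mesh):
    """Upsample a coarse (mh x mw) gain mesh to the (h x w) luma image
    by bilinear interpolation, then apply the gains multiplicatively.
    Each mesh point carries a single luma gain, as in the prior art."""
    h, w = luma.shape
    mh, mw = mesh.shape
    # Fractional mesh coordinates for every pixel (equally spaced mesh).
    ys = np.linspace(0, mh - 1, h)
    xs = np.linspace(0, mw - 1, w)
    y0 = np.floor(ys).astype(int).clip(0, mh - 2)
    x0 = np.floor(xs).astype(int).clip(0, mw - 2)
    fy = (ys - y0)[:, None]
    fx = (xs - x0)[None, :]
    # Gains at the four surrounding mesh points, bilinearly blended.
    g00 = mesh[np.ix_(y0, x0)]
    g01 = mesh[np.ix_(y0, x0 + 1)]
    g10 = mesh[np.ix_(y0 + 1, x0)]
    g11 = mesh[np.ix_(y0 + 1, x0 + 1)]
    gain = (g00 * (1 - fy) * (1 - fx) + g01 * (1 - fy) * fx
            + g10 * fy * (1 - fx) + g11 * fy * fx)
    return luma * gain

rng = np.random.default_rng(0)
luma = rng.uniform(50, 150, size=(16, 16))
flat = apply_mesh_gains(luma, np.ones((3, 3)))  # unit gains: unchanged
```

Because the gain at each mesh point is a single fixed number, the same correction is applied whether the scene is bright or dim, which is the nonlinearity limitation noted below.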
The methods of the current art for lens shading correction can result in a distorted captured image since (1) the current art does not take into account the curvature of the lens when creating a lens-shading mesh and/or (2) the current art does not take into account the nonlinearity of the lens shading phenomenon. Therefore, there exists a need for new methods and systems for calibrating an image capture device that account for the curvature of the lens and for the nonlinearity of the lens shading phenomenon to generate true captured images.