FIG. 1 illustrates an image pick-up device 102 such as a camera system that includes an image sensor 104. The image sensor 104 generates pixel data for an image of an object 106 that is projected through an objective lens 108 onto the image sensor 104. For example, the image sensor 104 may be a CIS (CMOS image sensor) commonly used in hand-held devices such as cell phones and PDAs (personal digital assistants).
A signal processor 110 manipulates the pixel data from the image sensor 104 for showing the image of the object 106 on a display 112, or for further processing by an image recognition system 114, or for sending the image via a transmission system 116 such that the image is shown on a remote display 118. Referring to FIGS. 1 and 2, the image sensor 104 generates pixel data 120 according to a Bayer filter array overlying the image sensor 104.
With the Bayer filter array, the image sensor 104 generates an intensity signal of a respective color at each pixel location. A square labeled with an “R” is for a pixel location on the image sensor 104 that generates an intensity signal of the red color component. Similarly, a square labeled with a “G” is for a pixel location on the image sensor 104 that generates an intensity signal of the green color component. Further, a square labeled with a “B” is for a pixel location on the image sensor 104 that generates an intensity signal of the blue color component.
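The Bayer arrangement of R, G, and B pixel locations described above may be sketched as follows. This is a minimal illustrative sketch only; the row/column phase of the mosaic (a GRBG phase is assumed here) varies between sensors and is not specified in the description.

```python
# Sketch of a Bayer color filter array: each pixel location on the
# image sensor senses only one color component (R, G, or B).
# A GRBG phase is assumed for illustration; actual sensors may differ.
def bayer_color(row, col):
    """Return the color component ('R', 'G', or 'B') sensed at
    pixel location (row, col) under the assumed GRBG mosaic."""
    if row % 2 == 0:
        return 'G' if col % 2 == 0 else 'R'
    else:
        return 'B' if col % 2 == 0 else 'G'

# Print the color layout of a 4x4 neighborhood of pixel locations.
for r in range(4):
    print(' '.join(bayer_color(r, c) for c in range(4)))
```

Note that half of the pixel locations sense green, matching the standard Bayer pattern in which green samples occur at twice the density of red or blue.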
An interpolation algorithm is then used by the signal processor 110 to determine the full set of intensity signals of the respective interpolated RGB color components for each of the pixel locations. The interpolation algorithm uses the pixel data of the Bayer color filter array 120 for such a determination.
Such an interpolation algorithm is known to one of ordinary skill in the art as disclosed in U.S. Pat. No. 5,382,976, U.S. Pat. No. 5,506,619, or U.S. Pat. No. 6,091,862. For determining the interpolated color components R′, G′, and B′ at a particular pixel location 124 with such an interpolation algorithm, a region of pixel data 126 surrounding that pixel location 124 is used as illustrated in FIG. 3.
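A simple bilinear-style interpolation over the region of pixel data surrounding a pixel location may be sketched as follows. This is an assumption for illustration only; the cited patents disclose more elaborate algorithms, and the `color_of` mosaic phase used here (GRBG) is likewise assumed.

```python
# Sketch of interpolating a missing color component at a pixel
# location by averaging that component over the surrounding region
# of Bayer pixel data. Bilinear averaging and the GRBG mosaic phase
# are assumptions for illustration, not the cited algorithms.
def color_of(row, col):
    """Color component sensed at (row, col) under an assumed GRBG mosaic."""
    if row % 2 == 0:
        return 'G' if col % 2 == 0 else 'R'
    return 'B' if col % 2 == 0 else 'G'

def interpolate(raw, row, col, color, radius=1):
    """Average `color` samples within a (2*radius+1) x (2*radius+1)
    window around (row, col); `raw` holds the Bayer pixel data."""
    if color_of(row, col) == color:
        return raw[row][col]          # sensed directly at this location
    total, count = 0, 0
    for r in range(row - radius, row + radius + 1):
        for c in range(col - radius, col + radius + 1):
            if (0 <= r < len(raw) and 0 <= c < len(raw[0])
                    and color_of(r, c) == color):
                total += raw[r][c]
                count += 1
    return total // count

# Under uniform illumination, every interpolated component recovers
# the uniform value.
raw = [[5] * 4 for _ in range(4)]
print(interpolate(raw, 0, 0, 'R'))    # -> 5
```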
Temporal noise affects the quality of the image of the object as detected and generated by the image pick-up device 102. Temporal noise is the variation in the output from the image sensor 104 even under uniform illumination onto the image sensor 104. Such temporal noise may arise from shot noise and 1/f noise at the photo-diodes of the image sensor 104, from thermal noise at the transistors and other circuit components used within the image sensor 104, or from quantization error of an A/D (analog to digital) converter used within the image sensor 104.
Such temporal noise increases with brightness of the image. However, the detrimental effect of the temporal noise on the image is greater at lower illumination because the SNR (signal to noise ratio) decreases with lower illumination. In fact, temporal noise sets a limit on the dynamic range of the image sensor 104 under dark conditions.
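The dependence of SNR on illumination may be illustrated with a back-of-the-envelope sketch. For shot noise, the noise standard deviation grows as the square root of the collected signal (in electrons), so SNR = N / sqrt(N) = sqrt(N) and falls as the scene darkens. The electron counts below are illustrative assumptions, not measured sensor values.

```python
import math

# Shot-noise-limited SNR sketch: noise sigma = sqrt(N) electrons,
# so SNR = N / sqrt(N) = sqrt(N). Lower illumination (fewer
# collected electrons) therefore means lower SNR, as stated above.
# The electron counts are illustrative assumptions only.
def shot_noise_snr_db(electrons):
    """SNR in dB for a shot-noise-limited signal of `electrons` e-."""
    return 20 * math.log10(math.sqrt(electrons))

for n in (10000, 1000, 100):    # bright -> dim
    print(f"{n:6d} e-  SNR = {shot_noise_snr_db(n):5.1f} dB")
```

A hundred-fold drop in illumination thus costs 20 dB of SNR in this shot-noise-limited sketch, consistent with temporal noise limiting the usable dynamic range under dark conditions.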
FIG. 4 illustrates a prior art process for reducing the effect of such temporal noise. The pixel data 120 is generated with the Bayer color filter array at the image sensor 104. The signal processor 110 interpolates such pixel data 120 to generate the respective interpolated RGB color components 122A, 122B, and 122C that are stored within a frame memory device 122 of the prior art.
In the prior art, after the interpolated RGB color components 122A, 122B, and 122C for an n×n array of pixel locations are generated and stored in the frame memory device 122, a noise reducing block 132 uses such interpolated RGB color components for reducing the deleterious effects of the temporal noise. FIG. 4 shows the noise reducing block 132 using the 3×3 arrays of the interpolated RGB color components 122A, 122B, and 122C. However, other prior art noise reducing processes may also use 5×5, 7×7, or other n×n arrays of the interpolated RGB color components.
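Such an n×n noise-reducing step may be sketched, for n = 3, as a simple box average over each interpolated color plane. A box filter is assumed here for illustration; the actual operation of the noise reducing block 132 is not limited to averaging.

```python
# Sketch of a 3x3 noise-reducing step (a box average is assumed for
# illustration): each interpolated color plane from the frame memory
# is smoothed over a 3x3 array of pixel locations.
def box_filter_3x3(plane):
    """Replace each interior pixel of a 2-D color plane with the
    average of its 3x3 neighborhood; borders are left unfiltered
    for brevity."""
    h, w = len(plane), len(plane[0])
    out = [row[:] for row in plane]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            out[r][c] = sum(plane[r + dr][c + dc]
                            for dr in (-1, 0, 1)
                            for dc in (-1, 0, 1)) // 9
    return out

# A single noisy spike is strongly attenuated by the 3x3 average.
noisy = [[0, 0, 0],
         [0, 90, 0],
         [0, 0, 0]]
print(box_filter_3x3(noisy)[1][1])    # -> 10
```

Larger 5×5 or 7×7 arrays trade stronger noise attenuation for more blurring and, in the prior art, a correspondingly larger region of stored interpolated components.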
In any case, for the prior art noise reducing process, the capacity of the frame memory device 122 must be sufficient to store the n×n arrays of interpolated RGB color components used by the noise reducing block 132. However, such a relatively large capacity of the frame memory device 122 is disadvantageous when the camera system 102 is incorporated as part of a hand-held device such as a cell phone or a PDA for example. Thus, elimination of the frame memory device 122 is desired for a smaller device size, lower power dissipation, and lower cost, especially when the camera system 102 is incorporated into a hand-held device.
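The scale of the frame memory burden can be seen with rough arithmetic. The sensor format and bit depth below are assumptions for illustration (the description does not specify them): a VGA frame with 8 bits per interpolated color component.

```python
# Rough arithmetic for the frame memory the prior art process needs.
# VGA resolution and 8 bits per interpolated component are assumed
# for illustration; the description does not specify the format.
width, height = 640, 480        # assumed VGA sensor
bytes_per_pixel = 3             # interpolated R', G', B' at 8 bits each
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes} bytes ~= {frame_bytes / 2**20:.2f} MiB")
```

Even under these modest assumptions, nearly a megabyte of on-chip or companion memory is needed, which motivates eliminating the frame memory device 122 in hand-held applications.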