1. Field of the Invention
The present invention relates generally to image sensors such as a CMOS (complementary metal oxide semiconductor) image sensor, and more particularly, to driving pixels of an image sensor with reduced area and high image quality.
2. Description of the Related Art
In general, a solid-state image-sensing device, such as a CMOS (complementary metal oxide semiconductor) image sensor (CIS) for example, is a semiconductor device that captures images by sensing light. The image sensor includes an array of hundreds of thousands to millions of pixels that convert light of an image into electrical signals. In addition, an analog-to-digital converter converts such electrical signals that are analog signals into digital signals that are then stored in data storage units.
Commercially available digital cameras, camcorders, and the like are expected to capture both still and moving images. Thus, an image sensor with millions of pixels, which is suitable for capturing a still image, is also desired for capturing a moving image.
As technology advances, the number of pixels in the semiconductor image sensor is likely to increase. Conventionally, capturing a still image or a moving image using one high resolution semiconductor image sensor such as a CMOS image sensor is performed in two ways.
According to one method, photocurrents from all pixels of the CMOS image sensor are measured to capture a still image. For capturing a moving image, photocurrents are measured only from pixels selected at predetermined intervals in the array of pixels. However, ignoring data from the non-selected pixels degrades the image quality of the moving image.
FIG. 1 is a block diagram of a conventional CIS type solid-state image-sensing device 100. Referring to FIG. 1, the conventional CIS type solid-state image-sensing device 100 includes an active pixel sensor (APS) array 110 having a matrix of pixels, a row driver 120, and an analog-to-digital conversion unit 130.
The row driver 120 receives a control signal from a row decoder (not shown), and the analog-to-digital conversion unit 130 receives a control signal from a column decoder (not shown). The solid-state image-sensing device 100 further includes a controller (not shown) that generates timing control signals and addressing signals for outputting selected and sensed video signals from each pixel of the APS array.
In the solid-state image-sensing device 100, a respective color filter is disposed over each pixel of the APS array 110 such that only light of a specific color is input to each pixel. To construct color signals, at least three kinds of color filters are arranged on the APS array 110. A general color filter array has a Bayer color pattern in which red and green color filters are alternately disposed along one row, and in which green and blue color filters are alternately disposed along the next row. Here, green which is closely related to a luminance signal is arranged in all rows with red and blue being alternately arranged in the rows to improve luminance resolution. A CIS having more than one million pixels is used in a digital still camera for improved resolution.
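The Bayer arrangement described above can be sketched in software as follows. This is purely an illustrative model, not part of the disclosed device; the function name and array convention are assumptions for illustration.

```python
# Hypothetical sketch of a Bayer color-filter layout: one row alternates
# red and green filters, the next alternates green and blue, so green
# (the color most closely related to luminance) appears in every row.
def bayer_pattern(height, width):
    pattern = []
    for row in range(height):
        line = []
        for col in range(width):
            if row % 2 == 0:
                line.append('R' if col % 2 == 0 else 'G')
            else:
                line.append('G' if col % 2 == 0 else 'B')
        pattern.append(line)
    return pattern
```

For a 4x4 array this yields rows of R, G, R, G alternating with rows of G, B, G, B, with green present in every row, matching the pattern described above.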
In the CIS type solid-state image-sensing device 100, the APS array 110 senses light using photodiodes and converts the sensed light into electric signals to generate image signals. The image signals output from the APS array 110 include red (R), green (G) and blue (B) analog signals. The analog-to-digital conversion unit 130 receives and converts such analog signals from the APS array 110 into digital signals.
When the CIS type solid-state image-sensing device captures a still image, video signals from all pixels of the APS array 110 are output. In the sub-sampling mode, however, vertical resolution is reduced, and video signals from only a subset of the pixels of the APS array 110 are output and processed.
For example, a CIS type solid-state image-sensing device having an APS array with super extended graphics array (SXGA) resolution outputs SXGA-grade image signals for capturing a still image. However, the solid-state image-sensing device outputs video graphics array (VGA)-grade video signals in sub-sampling mode operations including moving picture display, preview, and automatic focus.
In the sub-sampling mode of operation in the conventional CIS type solid-state image-sensing device 100, only image signals of selected rows and columns are output to the analog-to-digital conversion unit 130 to reduce resolution. Thus, some image data is not used in the sub-sampling mode, which causes aliasing noise in which oblique lines appear jagged (zigzag) on a display.
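The sub-sampling readout just described can be sketched as follows. This is an illustrative model with an assumed frame layout, not the device's actual readout circuitry: only every other 2x2 Bayer block is kept, and the rest of the pixel data is discarded.

```python
# Hypothetical sketch of sub-sampling: 2x2 Bayer blocks are read out only
# at predetermined intervals (every 'interval'-th block of rows and
# columns); the remaining pixel data is discarded, which halves resolution
# per dimension and is the source of the aliasing noise described above.
def subsample(frame, interval=2):
    out = []
    for r in range(0, len(frame), 2 * interval):
        for dr in (0, 1):          # keep both rows of the Bayer block
            row = []
            for c in range(0, len(frame[0]), 2 * interval):
                for dc in (0, 1):  # keep both columns of the Bayer block
                    row.append(frame[r + dr][c + dc])
            out.append(row)
    return out
```

Applied to an 8x8 frame with the default interval, this returns a 4x4 frame in which half of the rows and columns were never read.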
To remove such aliasing noise, a method of averaging image signals over a predetermined range has been proposed. For example, image signals are analog-averaged over a predetermined range before being output to the analog-to-digital conversion unit 130. Alternatively, digital signals from the analog-to-digital conversion unit 130 are averaged. However, such digital averaging requires a large-capacity memory, increasing chip area and power consumption. Furthermore, for analog-averaging image signals sensed by pixels of the APS array, each column requires two large capacitors for averaging reset signals and image signals, also increasing chip area. Such solid-state image-sensing devices with increased chip area and power consumption may not be amenable to small-size mobile devices.
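The digital-averaging alternative can be sketched as follows. This is an assumed illustration, not the proposed circuit: each output pixel averages four same-color samples (same-color sites in a Bayer mosaic lie two rows and two columns apart), so all pixel data contributes to the reduced-resolution frame.

```python
# Hypothetical sketch of digital averaging for resolution reduction: each
# output sample is the mean of four same-color Bayer samples spaced two
# pixels apart, halving resolution per dimension without discarding data.
# In hardware, this requires buffering several rows of digital samples,
# which is the large-capacity memory cost noted above.
def downscale_by_averaging(frame):
    h, w = len(frame), len(frame[0])
    out = []
    for r in range(0, h, 4):
        for dr in (0, 1):
            row = []
            for c in range(0, w, 4):
                for dc in (0, 1):
                    # Four same-color sites: offsets of 0 or 2 in each axis.
                    vals = [frame[r + dr + 2 * i][c + dc + 2 * j]
                            for i in (0, 1) for j in (0, 1)]
                    row.append(sum(vals) / 4)
            out.append(row)
    return out
```

Because every pixel participates in some average, oblique edges are smoothed rather than skipped, suppressing the zigzag aliasing at the cost of memory and power.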
Alternatively, for reducing resolution, photocurrents of adjacent pixels may be combined to capture a moving image, so that data from a substantial portion of the APS array is not discarded, thereby improving image quality. Nevertheless, a CMOS image sensor using primary color filters cannot use such a technique. Furthermore, even when the CIS has a shared floating diffusion (FD) pixel structure, since adjacent pixels have different color filters in a Bayer color pattern, photocurrent signals from adjacent pixels cannot be combined to represent a particular color.
FIG. 2 is a circuit diagram of the conventional APS array 110 of FIG. 1. The APS array 110 includes a plurality of pixels 101, 102, 103, 104, 105, 106, 107, and 108 and a plurality of signal converters 111, 112, 113, and 114. The pixels 101, 102, 103, 104, 105, 106, 107, and 108 are arranged along rows and columns forming a matrix of pixels, and each converts received light of a respective color into a corresponding photocurrent indicating the intensity of such received light. Each of the pixels 101, 102, 103, 104, 105, 106, 107, and 108 comprises a respective photodiode PD and a respective transfer MOSFET coupled between the respective photodiode and one of the signal converters 111, 112, 113, and 114.
Each of the photodiodes PD is for receiving light of a respective color defined by a color filter disposed thereon. Photodiodes PD with a label R1 or R2 are for receiving red colored light, photodiodes PD with a label B1 or B2 are for receiving blue colored light, and photodiodes PD with a label Ga1, Ga2, Gb1, or Gb2 are for receiving green colored light. Each of the signal converters 111, 112, 113, and 114 converts photocurrent output from any of the pixels 101, 102, 103, 104, 105, 106, 107, and 108 coupled thereto into an output voltage Vout.
APS array 110 of FIG. 2 has a shared FD pixel structure in which each of the signal converters 111, 112, 113, and 114 is coupled to a corresponding pair of two adjacent pixels along a column of the array of pixels to reduce the area of the APS array 110. For capturing a still image, the two adjacent pixels connected to a signal converter separately and sequentially output a respective photocurrent to the signal converter.
APS array 110 has a Bayer color pattern with the pixels receiving alternating colors along each column and row. Thus, the pixels 101, 102, 103, and 104 in the first column receive light of alternating colors of red, green, red, and green, respectively. Similarly, the pixels 105, 106, 107, and 108 in the second column receive light of alternating colors of green, blue, green, and blue, respectively.
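The color-mixing constraint on the shared FD structure can be sketched as follows. This is an illustrative model under the Bayer pattern described above; the function names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: in a Bayer pattern, two vertically adjacent pixels
# (the pair coupled to one signal converter / shared floating diffusion
# in the structure of FIG. 2) always carry different color filters, so
# summing their photocurrents would mix two colors rather than represent
# one particular color.
def bayer_color(row, col):
    if row % 2 == 0:
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'

def shared_pair_colors(row, col):
    # Colors of the two row-adjacent pixels sharing one signal converter.
    return bayer_color(row, col), bayer_color(row + 1, col)
```

For every column position, the two colors returned differ (red/green in odd-numbered columns of the pattern, green/blue in even-numbered ones), which is why the shared converter cannot combine the pair's photocurrents.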
Accordingly, in the APS array 110 of FIG. 2, each signal converter 111, 112, 113, or 114 is connected to two adjacent pixels with different color filters. Thus, such a signal converter cannot combine the photocurrent signals from such adjacent pixels for simplified signal processing. That is, for capturing the moving image, the CIS type solid-state image-sensing device 100 would process photocurrent data from a portion of the array of pixels selected at predetermined intervals or would separately measure the photocurrents for all pixels and perform an averaging through image signal processing (ISP).
However, capturing the moving image from photocurrents of a portion of the array of pixels results in low image quality. Alternatively, capturing the moving image by separately measuring the photocurrents for all pixels and averaging through ISP requires high frequency operation and high power consumption. Nevertheless, a shared FD pixel structure is desired for reducing the area of the solid-state image-sensing device 100.
Thus, a mechanism for driving the pixels of an image sensor having a shared FD pixel structure with high image quality is desired.