As solid-state imaging devices (image sensors) using photoelectric conversion elements which detect light and generate charges, CCD (charge coupled device) image sensors and CMOS (complementary metal oxide semiconductor) image sensors have been put to practical use. CCD image sensors and CMOS image sensors are widely used in digital cameras, video cameras, monitoring cameras, medical endoscopes, personal computers (PCs), mobile phones and other portable terminal devices (mobile devices), and various other electronic apparatuses.
A CCD image sensor and a CMOS image sensor both use photodiodes as the photoelectric conversion elements, but differ in how they transfer the photoelectrically converted signal charges. A CCD image sensor transfers signal charges to an output part through a vertical transfer part (vertical CCD, VCCD) and a horizontal transfer part (horizontal CCD, HCCD), then converts them to electrical signals which it then amplifies. In contrast, a CMOS image sensor amplifies the signal charge within each pixel, each pixel including a photodiode, and outputs the result as a readout signal.
Below, the basic configurations of a CCD image sensor and CMOS image sensor will be explained.
FIG. 1 is a view showing the basic configuration of an interline transfer (IT) type CCD image sensor.
An IT (interline transfer) type CCD image sensor 1 basically includes a photosensitive part 2, a horizontal transfer part (horizontal CCD) 3, and an output part 4. The photosensitive part 2 has a plurality of pixel portions 21 which are arranged in a matrix and convert incident light to signal charges having charge amounts in accordance with the light quantity, and vertical transfer parts (vertical CCDs) 22 serving as shielded charge transfer parts which vertically transfer the signal charges of the plurality of pixel portions 21 in units of columns. The horizontal CCD 3 horizontally transfers one line's worth of signal charges shifted from the plurality of vertical CCDs 22 in order in each horizontal scanning period. The output part 4 includes a floating diffusion layer for charge detection, that is, a "floating diffusion (FD)", which converts the transferred signal charges to signal voltages, and outputs the signals obtained at the FD to a signal processing system (not shown).
In this IT type CCD image sensor 1, the vertical CCDs 22 function as an analog memory; the sensor repeats a line shift followed by a horizontal transfer by the horizontal CCD 3 and thereby outputs the signals of all pixels (frame signal) in order from the output part 4.
This IT type CCD image sensor 1 has a structure enabling progressive reading (progressive scanning), but because the signal charges are transferred serially through the horizontal CCD 3, high-speed transfer is difficult.
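The line shift and horizontal transfer sequence described above can be modeled as follows. This is a minimal illustrative sketch, not part of the source; the function name and data representation are assumptions made purely to show the serial nature of the readout.

```python
# Illustrative model (not from the source) of IT-type CCD readout:
# each horizontal scanning period, the vertical CCDs shift one line
# into the horizontal CCD, which then transfers the charges serially
# to the output part.

def it_ccd_readout(pixels):
    """pixels: list of lines (each line a list of signal charges).
    Returns the frame signal in the order it leaves the output part."""
    # The vertical CCDs act as analog memory holding all lines.
    vertical_ccds = [list(line) for line in pixels]
    frame_signal = []
    while vertical_ccds:
        # Line shift: one line's worth of charges moves into the horizontal CCD.
        horizontal_ccd = vertical_ccds.pop(0)
        # Horizontal transfer: charges are shifted out one by one.
        while horizontal_ccd:
            frame_signal.append(horizontal_ccd.pop(0))
    return frame_signal
```

Because every charge must pass one at a time through the single horizontal register, the total readout time grows with the full pixel count, which is why high-speed transfer is difficult in this structure.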
FIG. 2 is a view showing the basic configuration of a frame interline transfer (FIT) type CCD image sensor.
The FIT (frame interline transfer) type CCD image sensor 1A is obtained by arranging shielded charge storage parts (storage parts) 5 between the output stages of the vertical CCDs 22 of the photosensitive part 2 in the IT type CCD image sensor 1 and the horizontal CCD 3. In the FIT type CCD image sensor 1A, all signal charges received by the vertical CCDs 22 of the photosensitive part 2 from the pixel portions 21 are transferred simultaneously to the completely shielded storage parts 5 by high-speed frame transfer.
In this way, in the FIT type CCD image sensor 1A, the signal charges read out from the pixel portions 21 in the photosensitive part 2 are transferred all at once to the storage parts 5 by the vertical CCDs 22. Therefore, compared with the IT type CCD image sensor 1 in FIG. 1, higher speed transfer is possible. However, because the FIT type CCD image sensor 1A must form the storage parts 5, its chip area becomes about two times larger than that of an IT type CCD image sensor.
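The frame transfer step that distinguishes the FIT type from the IT type can be sketched in the same illustrative style. Again, this is an assumption-laden model, not from the source: the point is only that the copy to the storage parts happens for all lines at once, freeing the photosensitive part for the next exposure while the slower serial readout proceeds from the shielded storage.

```python
# Illustrative model (not from the source) of the FIT-type frame transfer:
# all lines move simultaneously from the vertical CCDs to the shielded
# storage parts, after which the photosensitive part can start a new exposure.

def fit_frame_transfer(vertical_ccds):
    """vertical_ccds: list of lines of signal charges held in the
    photosensitive part. Returns the contents of the storage parts."""
    # High-speed frame transfer: a simultaneous copy of every line.
    storage_parts = [list(line) for line in vertical_ccds]
    # The photosensitive part is now empty and ready for the next frame,
    # while the horizontal readout works from the shielded storage parts.
    vertical_ccds.clear()
    return storage_parts
```

The doubled silicon area mentioned in the text follows directly from this model: the storage parts must hold a complete second copy of the frame.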
Note that the CCD image sensors explained above are capable of global shutter reading, in which storage of photocharges starts simultaneously for all pixels.
FIG. 3 is a view showing the basic configuration of a CMOS image sensor.
A CMOS image sensor 1B basically includes a photosensitive part comprised of a pixel array part 6, a row decoder (or row scanning circuit) 7, a column decoder (or horizontal scanning circuit) 8, an output part (output amplifier) 9, and column switches CSW. Further, in FIG. 3, LSL indicates row scanning lines, LSG indicates signal reading lines, and LTR indicates a transfer line.
In the CMOS image sensor 1B, the pixel array part 6 is configured by arranging a plurality of pixels, each including a photodiode, in a matrix. The pixels PXL in the pixel array part 6 are controlled row by row by row control signals (pulse signals) supplied from the row decoder 7. A signal output from a pixel PXL to the signal reading line LSG is transmitted through a column switch CSW to the transfer line LTR under column scanning by the column decoder 8 and is output to the outside by the output part 9.
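The row-by-row, column-by-column addressing just described can be sketched as follows. This is an illustrative model only (names and structure are assumptions, not from the source); it shows how the row decoder and column decoder together serialize the array onto the single transfer line.

```python
# Illustrative model (not from the source) of CMOS sequential readout:
# the row decoder selects one row at a time, each selected pixel drives
# its signal reading line, and the column decoder closes the column
# switches in order onto the transfer line.

def cmos_readout(pixel_array):
    """pixel_array: list of rows of pixel signal values.
    Returns the signals in the order they reach the output part."""
    transferred = []
    for row in pixel_array:            # row decoder: row control signals
        for signal in row:             # column decoder: scan column switches
            transferred.append(signal) # transfer line -> output amplifier
    return transferred
```

Unlike the CCD case, the charges here never move physically between pixels; each pixel amplifies its own signal, and only the addressing is sequential, which is what permits high-speed transfer.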
This CMOS image sensor 1B is structured so that high speed transfer of signals is possible, but global shutter reading cannot be carried out.
In this way, a basic CMOS image sensor is structured so that global shutter reading cannot be carried out; however, a CMOS image sensor that employs a multilayer (stacked) structure and enables global shutter reading has been proposed (see, for example, NPLT 1).
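The difference between the two exposure modes can be made concrete with a small timing sketch. This is an illustrative assumption, not from the source: the function name and the per-row time parameter are hypothetical, and serve only to contrast the staggered (rolling) exposure of a basic CMOS sensor with the simultaneous start that global shutter reading provides.

```python
# Hypothetical timing sketch (not from the source) contrasting rolling
# readout with global shutter reading.

def exposure_start_times(n_rows, row_time, global_shutter):
    """Return the exposure start time of each row.
    With a global shutter, all rows start at t = 0; with the rolling
    readout of a basic CMOS sensor, row i starts at i * row_time."""
    if global_shutter:
        return [0.0] * n_rows
    return [i * row_time for i in range(n_rows)]
```

With rolling readout the staggered start times mean that different rows capture a moving subject at different instants, producing the well-known distortion that global shutter reading avoids.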
FIG. 4 is a view showing an example of the configuration of a CMOS image sensor employing a stacked architecture.
The CMOS image sensor 1C in FIG. 4 employs a stacked architecture in which a first substrate 11 and a second substrate 12 sandwich a shield layer 13. On the first substrate 11, a photodiode (photoelectric conversion element) array part 6-1 and a part 7-1 of the row scanning circuit 7 are formed. Further, on the second substrate 12, a storage node array 6-2, the remaining part 7-2 of the row scanning circuit 7, column buffers CBUF, a horizontal scanning circuit (column decoder) 8, an output part 9, etc. are formed.
The characteristic feature of this CMOS image sensor 1C is that it eliminates the drawback of general CMOS image sensors, namely the inability to perform global shutter reading.