The present invention relates to an image sensing apparatus used with a computer and an image sensing system comprising an image sensing unit and a computer.
When a computer has a connector of the Personal Computer Memory Card International Association (PCMCIA) standard, it is common to use the computer with many kinds of card units of various functions, such as facsimile cards and memory cards, which also have PCMCIA standard connectors, by connecting them to the computer. In other words, a card having a PCMCIA standard connector can be used with many kinds of computers having PCMCIA standard connectors.
Since it is possible to transmit data at a high rate through connectors of the PCMCIA standard, they are effective as an input means for inputting an image, which is generally represented by a large amount of data, to a computer.
FIG. 10 is an overall view of a conventional image sensing system consisting of an image sensing unit, having a PCMCIA standard connector, for sensing an image by using an image sensing device, such as a CCD, and a computer having a connector of the same type.
In FIG. 10, reference numeral 51 denotes an image sensing unit having a PCMCIA standard connector; 52 and 53, computers; 54 and 55, connectors of the PCMCIA standard on the computers 52 and 53, respectively; and 56 and 57, display devices, such as liquid crystal displays, of the computers 52 and 53, respectively.
The image sensing system having the aforesaid configuration operates as follows.
When the image sensing unit 51 is connected to the computer 52, a user instructs the computer 52 to perform an image sensing operation, and the image sensing operation starts using a lens unit and a CCD in the image sensing unit 51. At this point, an image is represented by charges stored in the CCD.
In the image sensing unit 51, the sensed image (i.e., the stored charges) is read from the CCD pixel by pixel and sequentially converted into digital data. The computer 52 receives the converted digital image data through the connector 54 as the conversion proceeds, and stores it in its memory.
In a case where the image sensing unit 51 is connected to the computer 53, the computer 53 can receive image data in the same manner.
According to the aforesaid image sensing system, in the image sensing unit, it is necessary to read the charges stored in the CCD and convert them into digital data in the shortest possible period of time in order to obtain an image of good quality.
However, the rate at which a computer can receive digital data through connectors differs from computer to computer.
Thus, if the rate at which charges are read from the CCD (i.e., the image sensing rate) is set to a fixed value, the following inconvenience might occur. Assume that the rate at which the computer 52 can receive image data is higher than the rate at which the computer 53 can receive it. In this case, the computer 52 may be able to receive all the image data from the image sensing unit 51 when the image sensing unit 51 is connected to the computer 52, but the computer 53 may not be able to receive all the image data when the image sensing unit 51 is connected to the computer 53.
Further, the following case may arise. Even if the image sensing rate is set to the rate at which a computer can receive all the image data under a normal operating state, the computer may occasionally miss some image data because of interruptions by other processes while receiving the image data.
Accordingly, in the conventional image sensing system, it has been necessary to set the image sensing rate much lower than the rate at which a computer can receive image data, so as to prevent the computer from missing any image data. However, a low image sensing rate degrades image quality, because the object moves more during the longer sensing period and dark current noise increases.
Further, a still image can be stored by storing the image data in the memories of the computers 52 and 53. Furthermore, the display devices 56 and 57 of the computers 52 and 53 can be used as finders by displaying the image sensed by the image sensing unit 51 on them in real time. In this case, a user can confirm the composition of an image to be sensed and the size of an object in the image on the finder, and can sense a desired still image by instructing the computer, with its keyboard or mouse, to perform the image sensing operation when the user finds the desired image on the finder.
Now, the resolution of a liquid crystal display of a computer is generally low, and the number of colors the display can reproduce is often limited. Further, in order to use the displays of the computers 52 and 53 in FIG. 10 as finders, the amount of image data which the computer 52 or 53 receives from the image sensing unit 51 is preferably small, even though the resolution drops.
In contrast, in a case of sensing a still image with the image sensing system shown in FIG. 10, since the sensed still image may be displayed on a high-resolution display, it is preferable to sense the still image at as high a resolution as possible.
Therefore, in a case of sensing a still image with the image sensing system shown in FIG. 10, the computer 52 or 53 receives from the image sensing unit 51 image signals sensed at a resolution higher than that used for displaying the image on the finder. For example, if the liquid crystal displays 56 and 57 are in a 256-color palette mode, the image data required for displaying an image on such displays is 8 bits per pixel. In contrast, in order to sense a full-color still image, 16 or 24 bits per pixel are required.
The amount of data which the computers 52 and 53 can receive through the connectors 54 and 55 in a unit time period is constant. Therefore, in a case of sensing a still image, the clock speed of a timing generator, which controls the timing for storing image data in a memory of the image sensing unit 51, has to be decreased so that the rate at which the computers 52 and 53 receive the image data from the memory is equal to or higher than the rate at which the image data is stored into the memory, thereby preventing the memory from overflowing.
In other words, since the amount of data which the computer receives from the image sensing unit through the connector in a unit time period is fixed, upon sensing a still image, the clock frequency of the timing generator for controlling the CCD has to be halved, or decreased to one third of, the frequency used when an image is displayed on the finder.
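The clock-rate reasoning above can be sketched numerically. The connector throughput below is a hypothetical figure chosen for illustration; the bits-per-pixel values are those given in the text (8 bits for the finder, 16 or 24 bits for a still image).

```python
# Sketch: the connector carries a fixed number of bits per second, so the
# pixel clock must scale inversely with the bits per pixel being sent.
# CONNECTOR_BPS is a hypothetical throughput, not from the specification.

CONNECTOR_BPS = 8_000_000  # assumed fixed connector throughput, bits/s

def max_pixel_clock(bits_per_pixel):
    """Highest pixel clock (pixels/s) the connector can sustain."""
    return CONNECTOR_BPS // bits_per_pixel

finder_clock = max_pixel_clock(8)   # 8-bit palette image for the finder
still16 = max_pixel_clock(16)       # 16-bit still: clock is halved
still24 = max_pixel_clock(24)       # 24-bit still: clock cut to one third
```

Doubling or tripling the bits per pixel thus forces the timing generator clock down to one half or one third, as stated above.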
Further, in the aforesaid image sensing system, while an image is sequentially displayed on the display device 56 or 57, an automatic exposure controller calculates and sets, on the basis of exposure data, how long (i.e., for how many clocks) the CCD is to be exposed so that the image is obtained at a proper exposure (this number of clocks is hereinafter referred to as the "electronic shutter value").
If the user instructs the computer to perform the still image sensing operation upon finding a desired image on the display device, since the clock frequency of the timing generator is decreased for sensing the still image, the still image would be overexposed with the same electronic shutter value as that used for displaying the image on the finder.
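Why the same shutter value overexposes the still image can be shown with a short calculation. The clock frequencies and shutter value below are illustrative assumptions; the point is only that the exposure time equals the shutter value multiplied by the clock period, so it stretches when the clock slows down.

```python
# Sketch: exposure time = shutter_clocks / clock_hz. If the timing
# generator clock is divided by 3 for still sensing but the electronic
# shutter value is unchanged, the exposure time triples (overexposure).
# All numeric values are hypothetical.

def exposure_ns(shutter_clocks, clock_hz):
    """Exposure time in nanoseconds for a given shutter value and clock."""
    return shutter_clocks * 1_000_000_000 // clock_hz

finder_t = exposure_ns(900, 12_000_000)        # finder mode
still_same = exposure_ns(900, 4_000_000)       # clock/3, same value: 3x longer
still_fixed = exposure_ns(900 // 3, 4_000_000) # rescaled shutter value
```

Rescaling the electronic shutter value by the same factor as the clock restores the original exposure time.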
The present invention has been made to solve the aforesaid problems.
Furthermore, since the image sensing device used in the aforesaid image sensing unit, which is connected to a computer, is not constrained by a video signal format, it can be designed freely. However, since a conventional image sensing device is easy to obtain and allows the image sensing unit to be manufactured at low cost, an image sensing device designed for a video camera is often used.
Further, as an integrated circuit (IC) for the timing generator for operating the image sensing device, an IC for a video camera is used. In addition, the ICs for correlated double sampling, automatic gain control, signal processing, and the synchronizing signal generator used in the image sensing unit are also those used in a video camera. Consequently, an image sensing unit (camera) including the above parts operates at the video rate.
FIG. 11 is a block diagram illustrating a configuration of a conventional image sensing unit (digital camera). In FIG. 11, reference numeral 1 denotes an optical lens for forming an image of an object; 2, an iris diaphragm for controlling the quantity of incoming light of the optical image of the object passing through the optical lens 1; and 3, an image sensing device for converting the image of the object formed by the optical lens 1 into electric signals. As the image sensing device 3, an interlace scanning type CCD, commonly used in video movie equipment and having a color filter in which complementary colors are repeatedly arranged in a fixed pattern, is used.
Further, reference numeral 4 denotes a timing signal generator (TG) for generating timing signals necessary for operating the image sensing device 3; 5, an image sensing device operating unit for amplifying the timing signals from the TG 4 to a level with which the image sensing device 3 can be operated; 6, a correlated double sampling (CDS) circuit for removing output noises of the image sensing device 3; 7, an amplifier for amplifying output signals from the CDS circuit 6; and 8, a clamping circuit for stabilizing the zero (black) level of the amplified signals.
Reference numeral 9 denotes an analog-digital (A/D) converter for converting analog signals outputted from the clamping circuit 8 into digital signals; 10, an image signal processing circuit for processing the digitized signals; 18, a synchronizing signal generator (SSG) for generating pulses necessary for signal processing and synchronizing signals for dealing with video signals; 19, an interface (I/F) for outputting the signals processed by the image signal processing circuit 10 to a computer or a recording medium; 13, a luminance level detector for outputting information on an integrated value of luminance signals, generated by the image signal processing circuit 10, in a predetermined area; 14, a system controller for controlling the entire processing of the camera; 15, an oscillator for generating a base clock serving as a reference for the entire processing; and 16, an iris diaphragm driver for driving the iris diaphragm 2 to change the aperture.
Control of luminous exposure in the camera having the above configuration will now be explained.
The control of the luminous exposure is performed, for example, by controlling the quantity of light incident on the image sensing device 3 through the aperture of the iris diaphragm 2, thereby stabilizing the illuminance on the photosensing surface of the image sensing device. The quantity of light incident on the image sensing device 3 can be represented by an integrated value of luminance signals, obtained by the image signal processing circuit 10 on the basis of the output signals from a predetermined area of the photosensing surface of the image sensing device 3. The luminance level detector 13 then gives the information on the luminance level to the system controller 14.
The system controller 14 compares the obtained luminance level with a reference luminance level. If the obtained luminance level is higher than the reference luminance level, it controls the iris diaphragm 2 by instructing the iris diaphragm driver 16 to decrease the aperture of the iris diaphragm 2. Conversely, if the obtained luminance level is lower than the reference luminance level, it controls the iris diaphragm 2 by instructing the iris diaphragm driver 16 to increase the aperture of the iris diaphragm 2.
By performing the aforesaid feedback operation, the luminous exposure is kept at a suitable value. Note that the iris diaphragm 2 commonly includes an IG meter having a coil and a magnet, and a stepping motor. Further, in a case where a sufficient quantity of light cannot be obtained even when the iris diaphragm 2 is opened to its maximum, the gain set in the amplifier 7 is increased.
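The feedback operation described above can be sketched as follows. This is a minimal illustrative model, not the actual control firmware; the reference level, step size, and aperture scale are all assumed values.

```python
# Minimal sketch of the iris feedback loop described above: compare the
# integrated luminance level with a reference and step the aperture in
# the opposite direction. All values are hypothetical.

REFERENCE_LEVEL = 100  # assumed reference luminance level

def adjust_aperture(aperture, luminance, step=1):
    """Return the new aperture setting (a larger value means a wider opening)."""
    if luminance > REFERENCE_LEVEL:
        return aperture - step   # too bright: close the iris
    if luminance < REFERENCE_LEVEL:
        return aperture + step   # too dark: open the iris
    return aperture              # at the reference: leave it as is
```

Repeatedly applying this step drives the luminance level toward the reference, which is the stabilizing behavior of the feedback loop in the text.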
There is a trend toward digital cameras of lower price and lower energy consumption. Such cameras include a digital camera having no iris diaphragm and a digital camera having a plurality of fixed iris diaphragms which can be switched manually. Luminous exposure in these cameras is controlled by controlling the electronic shutters of their image sensing devices, in other words, by controlling the periods for storing charges in the image sensing devices. A method of controlling an electronic shutter will be explained below.
FIG. 12 shows a brief configuration of a general interlace scanning type CCD for a video camera. In FIG. 12, reference numeral 20 denotes photoelectric converters for converting incoming light whose wavelength is within a specific range into charges, and each photoelectric converter includes a photodiode. Reference numeral 21 denotes vertical charge coupled devices (VCCDs) for transferring the charges stored in each pixel in the vertical direction; 22, a horizontal charge coupled device (HCCD) for transferring the charges transferred via the VCCDs 21 line by line; and 23, a floating diffusion amplifier for converting the charges transferred via the HCCD 22 into voltage signals and outputting them.
FIGS. 13A and 13B show the detail of a pixel. Specifically, FIG. 13A is a cross-sectional view of a pixel, and FIG. 13B shows its potential profile.
As shown in FIGS. 13A and 13B, a vertical overflow drain is adopted as an anti-blooming structure which also provides the electronic shutter function. Further, the potential of the substrate is adjusted to a potential, Vsub (DC), at which the anti-blooming function works properly at a predetermined saturation charge. In addition, when a pulse of ΔVsub is applied, the signal charges stored in the photodiodes are drained.
FIG. 14 is a timing chart of the vertical transfer pulses ΦV1˜ΦV4, which govern timing in the period between just before one transfer of charges from the photodiodes to the VCCDs and just after the next such transfer, and of an electronic shutter pulse ΦVsub. The electronic shutter pulses are usually applied during the horizontal blanking period.
The charge storage period "te" (exposure time, or shutter speed) is the period from when the last electronic shutter pulse is applied between two charge transfer pulses, which transfer the charges from the photodiodes to the VCCDs at intervals of one vertical period (one field period), until the latter of the two charge transfer pulses is applied.
Therefore, in order to control the luminous exposure by means of the electronic shutter, the number of electronic shutter pulses applied after a given charge transfer pulse, which transfers the charges from the photodiodes to the VCCDs, is controlled. More specifically, the luminance level obtained by the luminance level detector 13 is compared with a reference luminance level. If the obtained luminance level is higher than the reference luminance level, the system controller 14 instructs the TG 4 to increase the number of electronic shutter pulses to be generated (i.e., the pulses applied in the period between two given charge transfer pulses; when the number of the electronic shutter pulses increases, the exposure time is shortened accordingly). On the contrary, if the obtained luminance level is lower than the reference luminance level, the system controller 14 instructs the TG 4 to decrease the number of electronic shutter pulses to be generated. With the above feedback operation, the exposure is stabilized at a suitable value.
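The electronic-shutter feedback above can likewise be sketched. The line count per field and the reference level are illustrative assumptions; the essential relation is that charge integrates only after the last shutter pulse in a field, so more pulses mean a shorter exposure.

```python
# Sketch of the electronic-shutter feedback: the exposure is the number of
# line periods remaining after the last shutter pulse in a field, so
# raising the pulse count shortens the exposure. Values are hypothetical.

LINES_PER_FIELD = 262    # assumed number of line periods in one field
REFERENCE_LEVEL = 100    # assumed reference luminance level

def adjust_shutter_pulses(pulses, luminance):
    """One feedback step on the electronic shutter pulse count."""
    if luminance > REFERENCE_LEVEL and pulses < LINES_PER_FIELD - 1:
        return pulses + 1    # too bright: more pulses, shorter exposure
    if luminance < REFERENCE_LEVEL and pulses > 0:
        return pulses - 1    # too dark: fewer pulses, longer exposure
    return pulses

def exposure_lines(pulses):
    # charge is stored only after the last shutter pulse of the field
    return LINES_PER_FIELD - pulses
```

As with the iris loop, iterating this step drives the exposure toward the reference level.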
However, the aforesaid conventional image sensing device has the following problems, especially in luminous exposure control.
(1) If a mechanical iris diaphragm meter or a motor is mounted, the size and weight of the camera increase, and the energy consumption also increases. This is fatal to a digital camera which is supplied with electric power from a computer.
(2) Because of a limitation imposed by the video rate, the exposure time is 1/60 second at maximum and 1/5000 second at minimum. Accordingly, it is not practical to adjust the luminous exposure by controlling the exposure time alone. It is possible to widen the adjustable range of the luminous exposure by providing a couple of fixed iris diaphragms and switching between them; however, it is still not enough. Furthermore, for sensing a low luminance object, since the exposure time can be extended only up to 1/60 second, the lowest luminance that the camera can sense is not low enough. This is a fatal drawback, since it makes it difficult to sense an image inside a building.
(3) The gain set in the amplifier for amplifying the output signals from the image sensing device may be increased by 6 dB to 18 dB when sensing a low luminance object; however, the noise also increases and the S/N ratio deteriorates, so the obtained image would not be good.
(4) In a case of controlling the luminous exposure by using the electronic shutter, when the exposure time is to be shortened (i.e., when sensing a high luminance object), an increase or decrease of one electronic shutter pulse causes a considerable change in the exposure time, so the exposure time cannot be finely adjusted. More specifically, the electronic shutter pulses are generated during the horizontal blanking periods so as to avoid adding noise to the image signals, and are therefore applied at intervals of one horizontal period; consequently, when the exposure time is shortened, the ratio of one horizontal line period to the entire exposure period becomes high.
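The coarseness of this adjustment at short exposures can be quantified with a short calculation. The line counts below are illustrative; the point is that one extra shutter pulse changes the exposure by one horizontal line period, so the relative step is the reciprocal of the exposure length in lines.

```python
# Sketch of problem (4): shutter pulses land on one-horizontal-period
# boundaries, so adding one pulse changes the exposure by one line period.
# The relative exposure step is therefore 1 / exposure_lines, which grows
# coarse at short exposures. Line counts below are hypothetical.

def relative_step(exposure_lines):
    """Fractional exposure change caused by one extra shutter pulse."""
    return 1.0 / exposure_lines

long_exposure = relative_step(250)  # long exposure: 0.4% step, fine control
short_exposure = relative_step(4)   # short exposure: 25% step, coarse control
```

At a long exposure the one-pulse step is negligible, but at a short exposure it changes the exposure by a quarter, which is the lack of fine adjustment described above.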