In recent years, an imaging system has been known that includes a color camera that captures a color (red, green, and blue: RGB) image, a near-infrared (NIR) camera that captures an NIR image, and a half mirror (for example, refer to NPL 1). In the imaging system disclosed in NPL 1, in a case where a color image is captured, a flash that emits NIR rays toward a subject is not used, and in a case where an NIR image is captured, the flash is used.
In the imaging system disclosed in NPL 1, light from the subject is split in two directions by the half mirror. The split light is imaged by the color camera and the NIR camera, whereby an RGB image and an NIR image are obtained. The RGB image is a color image. The NIR image is a monochrome image captured while the NIR rays are emitted. In the imaging system disclosed in NPL 1, a composite image in which a color tone is added to the monochrome image can be obtained by compositing the RGB image and the NIR image.
In addition, it is known that image sensors for capturing images come in two types: image sensors of a global shutter system and image sensors of a rolling shutter system. In a case where an image sensor of the rolling shutter system is used, the frame rate of the output image obtained by compositing the RGB image and the IR image is ¼ of the frame rate in a case where only the RGB image or only the IR image is output without compositing. The reason for this is that, in an image sensor of the rolling shutter system, the exposure start timing of the first line and that of the last line of the image sensor elements differ from each other; each image is therefore captured over two frames, and the output image is generated only after the RGB image and the IR image are composited.
FIG. 13 is a timing chart illustrating a specific example of the imaging operation of a monitoring camera, in a comparative example, that includes an image sensor performing the imaging operation according to the rolling shutter system.
In FIG. 13, in the frame immediately before frame b0, an imaging parameter and a light emission condition suitable for the IR image are set (sensor register control). The sensor register control indicates that the imaging parameter held in a memory (register) (not illustrated) is set. In frame b0, the image sensor starts capturing the IR image. Since an image sensor of the rolling shutter system is used, the capturing of the IR image extends over frame b1.
In frame b1, the imaging parameter and the light emission condition suitable for the RGB image are set (sensor register setting). As a result, the IR lighting is switched from ON to OFF, and the brightness of the IR image being captured changes during capturing. Since the image obtained in frame b1 therefore becomes a blurry image, the IR image captured in frame b1 is discarded as a dummy image (Dummy).
In frame b2, the IR image captured by the image sensor over frames b1 to b2 is acquired.
In frame b3, an RGB image captured by the image sensor over frames b2 to b3 is acquired. At the same time, however, the imaging parameter and the light emission condition suitable for the IR image are set. As a result, the IR lighting is switched from OFF to ON, and the brightness of the RGB image being captured changes during capturing. Since the RGB image obtained in frame b3 is therefore blurry, the RGB image captured in frame b3 is discarded as a dummy image (Dummy).
In frame b4, an RGB image captured by the image sensor over frames b3 to b4 is acquired, and the IR image acquired in frame b2 and the RGB image acquired in frame b4 are composited to obtain a composite image. Subsequently, a similar operation is repeated.
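The four-frame cycle of the comparative example described above can be sketched as a simple simulation. This is an illustrative sketch only, not the method of NPL 1; the function name `simulate` and the frame labels are assumptions based on the description of FIG. 13.

```python
# Illustrative sketch (assumption, not from NPL 1): simulate the repeating
# four-frame cycle of the comparative example in FIG. 13. Frames in which the
# register setting (and thus the IR lighting state) changes mid-exposure are
# discarded as dummy images; a composite is produced once per cycle.

def simulate(num_frames):
    """Return the per-frame result of the alternating IR/RGB capture cycle."""
    results = []
    for n in range(1, num_frames + 1):  # frames b1, b2, b3, b4, b5, ...
        phase = n % 4
        if phase == 1:
            results.append("dummy")          # IR capture blurred by lighting switch
        elif phase == 2:
            results.append("IR")             # clean IR image acquired
        elif phase == 3:
            results.append("dummy")          # RGB capture blurred by lighting switch
        else:
            results.append("RGB+composite")  # RGB acquired, composited with stored IR
    return results

print(simulate(8))
```

As the output shows, only one composite image is produced per four sensor frames, which is why the output frame rate drops to ¼.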
In the imaging device (for example, the monitoring camera) of the comparative example illustrated in FIG. 13, the exposure time for each of the IR image and the RGB image is at most one frame, and the images captured in frames b1, b3, b5, and b7 are discarded as dummy images. This causes the following problems: the frame rate of the output image deteriorates, the sensitivity of the IR image and the RGB image obtained from the image sensor deteriorates, and an afterimage is generated between the IR image and the RGB image. For example, in a case where a 60 fps (frames per second) image sensor of the rolling shutter system is used, the exposure time for the RGB image is one frame (that is, 1/60 second) and the frame rate of the output image decreases to 15 fps.
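The frame-rate reduction stated above follows from one composite image being produced every four sensor frames. A minimal sketch of the arithmetic, using the 60 fps figure from the text:

```python
# Sketch of the frame-rate arithmetic for the comparative example.
sensor_fps = 60                    # rolling-shutter sensor frame rate (from the text)
frames_per_composite = 4           # dummy IR, IR, dummy RGB, RGB (FIG. 13 cycle)

exposure_s = 1 / sensor_fps        # one-frame exposure: 1/60 second
output_fps = sensor_fps / frames_per_composite

print(exposure_s, output_fps)      # 1/60 s exposure, 15 fps output
```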
Furthermore, as illustrated in FIG. 14, when an image sensor of the rolling shutter system is used, the exposure start timings of the image sensor elements at the upper part and the lower part differ from each other. Therefore, in a case where the subject is a moving object, a phenomenon called dynamic distortion (focal plane distortion) occurs.
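The line-dependent exposure start of a rolling shutter can be sketched as follows. The line count and readout time below are assumed illustrative values, not figures from the text.

```python
# Illustrative sketch: in a rolling shutter, each line starts exposing slightly
# later than the line above it, so a moving subject is sampled at different
# times at the top and bottom of the frame (dynamic / focal plane distortion).
num_lines = 1080              # assumed sensor line count
frame_readout_s = 1 / 60      # assumed time to read out one full frame
line_delay_s = frame_readout_s / num_lines

def exposure_start(line):
    """Exposure start offset of a given line relative to the first line."""
    return line * line_delay_s

# The last line starts exposing almost a full frame time after the first line.
print(exposure_start(num_lines - 1))
```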
An object of the present disclosure is to provide an imaging device and an imaging method that can reduce the distortion of the output image and suppress the deterioration of the frame rate of the output image with a simple configuration, even in a case where an image sensor of the rolling shutter system is used.