Currently, a CMOS image sensor (referred to hereinafter as a CMOS) has become the mainstream image sensor mounted in photographing apparatuses and the like.
Owing to its structure, the CMOS cannot simultaneously expose all pixels to light and simultaneously read the pixel signals obtained as a result of the exposure. Accordingly, a so-called “rolling shutter,” which sequentially scans pixels line by line to perform exposure and pixel signal reading, is performed. However, there are CMOS sensors that have a so-called “global shutter function,” which can synchronize the exposure timing of all pixels.
FIG. 1 is an explanatory diagram of a rolling shutter in a CMOS and illustrates a 0-th frame f0, a first frame f1, a second frame f2, . . . , which are consecutive and constitute a moving image. Each frame includes pixels corresponding to the 0-th to (h−1)-th lines.
As described above, exposure and reading are performed line by line under the rolling shutter. For example, when the frame rate of a photographed moving image is 30 fps, the read timing difference between vertically adjacent lines is 1/(30×h) seconds. Although 1/(30×h) seconds is a short time, it cannot be ignored when accumulated. Considering two lines of the same frame that are separated from each other to some degree, for example, it is difficult to regard the read timings of the two lines as the same time, and the frame is distorted due to the difference between the read timings.
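The timing relation above can be made concrete with a short calculation. The following sketch assumes the frame rate of 30 fps and the line count h from the description; the function names are illustrative only:

```python
# Per-line read timing under a rolling shutter.
# With a frame rate of `fps` and `h` lines per frame, vertically
# adjacent lines are read 1/(fps * h) seconds apart.

def line_read_delay(fps: int, h: int) -> float:
    """Read timing difference between vertically adjacent lines."""
    return 1.0 / (fps * h)

def accumulated_delay(fps: int, h: int, line_a: int, line_b: int) -> float:
    """Accumulated read timing difference between two lines of the same frame."""
    return abs(line_b - line_a) * line_read_delay(fps, h)

# Example: 30 fps, h = 1080 lines.
dt = line_read_delay(30, 1080)                        # ~30.9 microseconds per line
top_to_bottom = accumulated_delay(30, 1080, 0, 1079)  # ~33.3 ms across the frame
```

The per-line delay is negligible on its own, but the accumulated top-to-bottom delay approaches a full frame period, which is why two well-separated lines cannot be treated as captured at the same time.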
Distortion generated in a frame due to the rolling shutter will be described.
FIG. 2 illustrates a scene (structures such as a house and a building) corresponding to a subject of video shooting. It is assumed that the photographing range of the subject, defined by a broken line, is photographed as a moving image, for example, using a video camera fixed to a stopped vehicle whose engine is running. Here, the video camera is assumed to micro-vibrate horizontally due to vibration of the engine.
FIG. 3 illustrates one frame of the moving image photographed in the state in which the vehicle is stopped and the video camera is micro-vibrating. As illustrated in the figure, horizontal distortion (referred to hereinafter as distorted high-frequency components) caused by the micro-vibration of the video camera is generated in the video frame, and thus it is difficult to view the moving image in this state.
Accordingly, methods of correcting such distorted high-frequency components have been established. Specifically, a pixel matching process that specifies the positions of the same portion of the captured subject between two temporally adjacent frames (e.g., the temporally adjacent (n−1)-th frame and n-th frame) is performed to obtain a corresponding relation between the pixels of the two frames, and the distorted high-frequency components generated in the frames are corrected on the basis of the corresponding relation.
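One common way to realize such a pixel matching process is a block search that minimizes the sum of absolute differences (SAD) between the two frames. The sketch below is a minimal illustration of that idea, not the method of any particular patent; it assumes grayscale frames stored as 2D lists, and all names are hypothetical:

```python
def sad(a, b, ay, ax, by, bx, size):
    """Sum of absolute differences between a size x size block at
    (ay, ax) in frame `a` and a block at (by, bx) in frame `b`."""
    total = 0
    for dy in range(size):
        for dx in range(size):
            total += abs(a[ay + dy][ax + dx] - b[by + dy][bx + dx])
    return total

def match_block(prev, cur, y, x, size, search):
    """Find the horizontal shift (in pixels) of the block at (y, x) in
    `prev` that best matches `cur`, within a search range of +/- `search`."""
    best_shift, best_cost = 0, float("inf")
    for s in range(-search, search + 1):
        nx = x + s
        if nx < 0 or nx + size > len(cur[0]):
            continue  # candidate block would fall outside the frame
        cost = sad(prev, cur, y, x, y, nx, size)
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```

Note that the cost of the search grows with the `search` parameter: a narrow range (as for the micro-vibration case of FIG. 3) keeps the loop cheap and unambiguous, while a wider range admits more candidate positions.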
FIG. 4 illustrates a result of distortion correction performed on the frame having the distorted high-frequency components illustrated in FIG. 3. If a correct corresponding relation between the pixels of the two frames can be obtained, the distorted high-frequency components can be corrected as illustrated in FIG. 4. However, as seen from a comparison between the viewing angle of the frame before distortion correction illustrated in FIG. 3 and that of the frame after distortion correction illustrated in FIG. 4, the viewing angle of the frame after distortion correction is narrowed.
A case in which photographing is performed in a direction perpendicular to a traveling direction of the traveling vehicle (right lateral direction) using the video camera fixed to the vehicle, as mentioned above, is considered.
FIG. 5 illustrates the shift of the photographing range when a moving image is photographed while the vehicle is traveling. That is, when the vehicle travels to the right in the figure, the photographing range of the video camera also moves to the right in the figure, as indicated by broken lines.
FIG. 6 illustrates one frame of the moving image captured while the vehicle is traveling. As illustrated in the figure, distorted high-frequency components are generated in the video frame as in the case of FIG. 3. Furthermore, distortion in which an upright house and building are photographed as if tilted (referred to hereinafter as distorted low-frequency components) is generated. This is because there is a difference between the timing of photographing the 0-th line of the frame and the timing of photographing an m-th line (m being an integer in the range of 1 to h−1) of the frame.
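The tilt can be modeled with the line timing described above: if the photographing range moves horizontally at a constant v pixels per second, the m-th line is read v·m/(fps·h) pixels later in the motion than the 0-th line. The following sketch, with hypothetical names and example numbers chosen only for illustration, computes that per-line displacement:

```python
def line_offset(v_px_per_s: float, fps: int, h: int, m: int) -> float:
    """Horizontal displacement (in pixels) of line m relative to line 0,
    for a photographing range moving at v_px_per_s pixels per second."""
    return v_px_per_s * m / (fps * h)

# Example: range moving at 3240 px/s, 30 fps, h = 1080 lines.
# The bottom line (m = 1079) is displaced by 107.9 px relative to the
# top line, so vertical edges appear uniformly tilted across the frame.
bottom_skew = line_offset(3240.0, 30, 1080, 1079)
```

Because the displacement grows linearly with m, this term produces the slow, frame-wide tilt (the distorted low-frequency components), as opposed to the line-to-line jitter caused by micro-vibration.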
It is difficult to view the moving image having the distorted high-frequency components and the distorted low-frequency components illustrated in FIG. 6. Accordingly, various methods for correcting such distortion have been provided. Specifically, a pixel matching process is performed between two temporally adjacent frames, and the distortions (distorted high-frequency components and distorted low-frequency components) are collectively corrected on the basis of the corresponding relation between the pixels of the two frames, as in the aforementioned distorted high-frequency component correction.
FIG. 7 illustrates a result of distortion correction performed on the frame having the distorted high-frequency components and distorted low-frequency components illustrated in FIG. 6. If a correct corresponding relation between the pixels of the two frames can be obtained, the distorted high-frequency components and the distorted low-frequency components can be corrected at one time as illustrated in FIG. 7.
However, in the case of the video frame captured while the vehicle is traveling, the viewing angle of the frame after correction of the distorted high-frequency components and the distorted low-frequency components becomes narrower than that of the frame after correction of only the distorted high-frequency components illustrated in FIG. 4, because the photographing range changes as illustrated in FIG. 5.
With respect to the pixel matching process for the moving image photographed in a stopped state, illustrated in FIG. 3, the search range may be narrowed since the photographing range of each frame changes only by a degree corresponding to the micro-vibration. Accordingly, corresponding pixel positions between two frames can be detected relatively easily and accurately.
With respect to the pixel matching process for the moving image photographed in a traveling state, illustrated in FIG. 6, it is necessary to widen the search range because the frames have different photographing ranges. This increases the amount of computation and the possibility that corresponding pixel positions between two frames are erroneously detected. If corresponding pixel positions between two frames are erroneously detected and distortion correction is performed on the basis of those positions, the moving image may become more difficult to view than before correction.
Accordingly, as a method for solving this problem, there has been proposed a method of performing distortion correction only when the pixel matching process has high accuracy and performing no distortion correction when it has low accuracy, or of decreasing a distortion adjustment value when consecutive frames captured in a traveling state move in the same direction (refer to Patent Literature 1, for example).