Image sensors are commonly used for the acquisition of images, such as in still photography, videography, and the like. Image sensors capture images by exposing a plurality of pixels to light and then reading the individual pixels. For a given image being captured, every pixel of an image sensor will generally have the same integration duration (often referred to as the integration time). Integration duration represents the duration of time between pixel sensor reset and sampling and is analogous to exposure duration in film photography. Despite pixels having a common integration duration, reading or capturing image data from the sensor is typically performed sequentially on a pixel-by-pixel basis, typically in a serial row-column pixel read pattern operating at a frequency which may be referred to as the pixel clock.
FIG. 2A shows a schematic illustration of an example image sensor 20. Image sensor 20 has a serial row-column pixel readout pattern, where pixels are read from sensor 20 along each individual row 21 before jumping (along the diagonals illustrated using dotted lines) from the end of a given row 21 to the beginning of a subsequent row 21. That is, in the case of the FIG. 2A illustration, scanning advances in a particular direction along row axis 23 for each given row 21 before proceeding to each subsequent row 21 along column axis 25.
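The serial row-column readout pattern described above may be sketched as follows. This is a minimal illustration only; the sensor dimensions and the function name are arbitrary and do not correspond to any particular hardware.

```python
# Minimal sketch of a serial row-column pixel readout pattern:
# pixels are read along each row in turn, and the scan then jumps
# from the end of one row to the beginning of the next
# (hypothetical 3-row by 4-column sensor).
def readout_order(num_rows, num_cols):
    """Yield (row, col) pixel coordinates in serial row-column order."""
    for row in range(num_rows):
        for col in range(num_cols):
            yield (row, col)

order = list(readout_order(3, 4))
# The first pixel read is (0, 0); the last pixel read is (2, 3),
# reached after traversing every row in sequence.
```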
The two most common types of image sensors are referred to as “global shutter” sensors and “rolling shutter” sensors. Image sensor hardware (such as CMOS image sensor hardware) is typically configured for use in either a global shutter sensor configuration or a rolling shutter sensor configuration. A global shutter image sensor will expose all of the active pixels of the sensor (typically an array of rows and columns of pixels) substantially simultaneously and for substantially the same integration duration. That is, every active pixel of the global shutter image sensor begins its integration duration at substantially the same time and ends its integration duration at substantially the same time. After all of the active pixels have undergone integration, the individual pixels of the global shutter image sensor are flushed to suitable “hold circuitry” to prepare the sensor for a subsequent integration. Image data is then read (typically sequentially) from the hold circuitry. Once every pixel of a first image has been flushed from the sensor to the hold circuitry, the global shutter image sensor may capture another image by again undergoing simultaneous integration of all of its pixels.
In contrast to a global shutter image sensor, a rolling shutter image sensor integrates different portions of the image sensor (e.g. individual pixels or groups of pixels, such as one or more rows or columns of pixels) at different times (i.e. with different integration start and stop times). As soon as a particular portion of the rolling shutter image sensor has been integrated, the pixels in that portion of the sensor may be sequentially read. Once the pixels in that particular portion of the rolling shutter image sensor are read, they may be reset and integration of those pixels may begin again immediately. When compared to a global shutter image sensor, a rolling shutter image sensor dispenses with the need for hold circuitry.
In addition to the need for such hold circuitry, another drawback associated with global shutter sensors occurs in the context of capturing video data or in other circumstances where it may be desirable to capture multiple successive images (e.g. video image frames) at relatively high rates (e.g. 24, 30 or 60 images/frames per second). To achieve framerates and light sensitivity comparable to those of rolling shutter sensors, global shutter sensors typically require pixel dimensions which are large in comparison to those of rolling shutter image sensors (to allow each pixel to collect more photons during an integration interval) and may consequently require physically larger and more expensive optics.
While there can be advantages (such as those discussed above) associated with using a rolling shutter sensor over a global shutter sensor, rolling shutter image sensors can introduce artifacts into images. For example, if a rolling shutter image sensor undergoes rapid translational or rotational movement while reading image data (or, conversely, the objects in the image move rapidly with respect to the sensor), resultant images may be skewed and/or otherwise distorted.
FIGS. 2B and 2C respectively schematically show example images 22B and 22C captured by an example global-shutter sensor 20B and an example rolling-shutter sensor 20C. Sensors 20B and 20C are moving in movement direction 24 relative to the object being imaged (in the illustrated example, a tree). Global-shutter sensor 20B captures image 22B by having each pixel undergo integration at the same time (i.e. with the same integration start and stop times). After the captured image is flushed to corresponding hold circuitry and the pixels are sequentially read out in accordance with the readout pattern of FIG. 2A, image 22B is substantially free of skew or other distortions.
On the other hand, the FIG. 2C rolling-shutter sensor 20C captures image 22C by having portions of sensor 20C undergo integration with different start and stop times. In the particular case of the FIG. 2C example, individual pixels of sensor 20C undergo integration in accordance with a pattern similar to the pixel readout pattern of FIG. 2A (i.e. along each row 21 from left to right and then jumping from the end of each row 21 to the beginning of a subsequent row 21, as illustrated schematically by the diagonal dashed lines in FIG. 2C). Individual pixels of sensor 20C may then be read out using the same readout pattern. The resultant image 22C shows that later-integrated pixels in image 22C (i.e. pixels having integration and readout times closer to those of the last pixel 28 to be integrated and read) may have image data that is translated relative to the image data provided by earlier-integrated pixels in image 22C (i.e. pixels having integration and readout times closer to those of the first pixel 26 to be integrated and read). The result is an apparent skew of image 22C relative to image 22B.
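The apparent skew described above may be modelled in simplified form: each successively integrated row captures the scene slightly later, so scene content moving relative to the sensor is displaced by an amount proportional to the row's readout delay. The following sketch uses hypothetical numbers (row period and relative speed are assumptions, not values from any particular sensor) and assumes purely horizontal, constant-speed relative motion.

```python
# Simplified model of rolling-shutter skew: a row integrated later
# "sees" the moving scene displaced by (relative speed) x (row delay).
# All numeric values below are hypothetical, for illustration only.
def row_shift_pixels(row_index, row_period_s, speed_px_per_s):
    """Horizontal displacement (in pixels) of scene content in a row
    that begins integration row_index * row_period_s seconds after
    the first row."""
    return speed_px_per_s * row_index * row_period_s

ROW_PERIOD_S = 25e-6      # assumed time between successive row starts
SPEED_PX_PER_S = 1000.0   # assumed scene speed relative to the sensor

shift_top = row_shift_pixels(0, ROW_PERIOD_S, SPEED_PX_PER_S)
shift_bottom = row_shift_pixels(1199, ROW_PERIOD_S, SPEED_PX_PER_S)
# The top row is undisplaced while the bottom row is shifted by
# roughly 30 pixels, producing the skew shown schematically in FIG. 2C.
```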
By way of non-limiting demonstrative example, suppose sensor 20C is a rolling-shutter WUXGA image sensor with total pixel dimensions of 1980×1300 (including blanking zones) and active pixel dimensions of 1920×1200 (without blanking zones). If sensor 20C captures video data with a frame rate of 30 frames per second (FPS), the minimum pixel clock frequency would be approximately 77.22 MHz (i.e. 30 frames per second*(1980*1300 pixels per frame)). If the integration time was 15 ms, then the image data corresponding to first pixel 26 would be captured (i.e. pixel 26 undergoes integration) between t=0 and t=15 ms, and image data corresponding to last pixel 28 would be captured (i.e. pixel 28 undergoes integration) between t=30.77 ms and t=45.77 ms (last pixel 28 begins integration approximately 1980 pixel clocks per row*1200 active rows/77.22 MHz≈30.77 ms after first pixel 26). This is a substantial delay, particularly if the sensor and/or any objects being imaged are moving.
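The timing arithmetic of the preceding example may be reproduced as follows. The figures are the same WUXGA example values used above; this is an illustrative calculation, not a specification of any particular sensor.

```python
# Reproduction of the example WUXGA rolling-shutter timing arithmetic.
FPS = 30
TOTAL_COLS, TOTAL_ROWS = 1980, 1300     # total pixels, including blanking
ACTIVE_COLS, ACTIVE_ROWS = 1920, 1200   # active pixels, without blanking
INTEGRATION_S = 15e-3                   # 15 ms integration time

# Minimum pixel clock: every total pixel must be read once per frame.
pixel_clock_hz = FPS * TOTAL_COLS * TOTAL_ROWS   # 77,220,000 Hz (~77.22 MHz)

# The last active pixel begins integration one row period (1980 clocks,
# including horizontal blanking) per active row after the first pixel.
last_pixel_delay_s = (TOTAL_COLS * ACTIVE_ROWS) / pixel_clock_hz  # ~30.77 ms
last_pixel_start_s = last_pixel_delay_s
last_pixel_end_s = last_pixel_start_s + INTEGRATION_S             # ~45.77 ms
```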
There is a general desire for systems and methods which reduce or ameliorate at least some of the artifacting associated with rolling-shutter image sensors. The Applicant has discovered that the artifacting exhibited by rolling-shutter image sensors may be exacerbated in certain applications (described below), where images from a plurality of individual rolling shutter image sensors are stitched together or otherwise combined to provide a combined image. There is a general desire for systems and methods which reduce or ameliorate at least some of the artifacting associated with combining the images from a plurality of individual rolling shutter image sensors in such applications.
The foregoing examples of the related art and limitations related thereto are intended to be illustrative and not exclusive. Other limitations of the related art will become apparent to those of skill in the art upon a reading of the specification and a study of the drawings.