Current apparatuses for image recording, such as still cameras, video cameras, cameras incorporated into mobile devices like laptops or cellular phones, or apparatuses for scientific image recording, use electronic sensors to record an image. Commonly used electronic sensors for image recording or capturing are CCD (Charge Coupled Device) sensors and CMOS (Complementary Metal Oxide Semiconductor) sensors. Such sensors typically include a plurality of sensor elements, sometimes referred to as pixels (“picture elements”), which are arranged in a geometrical pattern. A commonly used sensor element arrangement, for example in digital cameras, is a rectangular array, although other arrangements, for example circular arrays, are equally used. The number of such sensor elements, each of which records at least the brightness of a corresponding picture part, determines the resolution of the recorded image. For recording color information, sensors exist where a single sensor element is able to distinguish between colors, while in other types of image sensors color filters for the primary colors red, green and blue are arranged in front of the sensor, for example in a so-called Bayer pattern, such that each sensor element records a specific color and the overall color may be determined from adjacent picture elements recording different colors.
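The color reconstruction from adjacent picture elements described above can be sketched as follows. This is a minimal illustrative example, not an implementation from the text; the function names, the RGGB pattern orientation, and the simple neighbor-averaging scheme are all assumptions chosen for clarity.

```python
# Hypothetical sketch: estimate a full (R, G, B) color at one pixel of an
# RGGB Bayer mosaic by averaging the nearest sensor elements of each color.
# All names and the RGGB layout are illustrative assumptions.

def pixel_color(py, px):
    """Color recorded by the sensor element at (py, px) in an RGGB pattern:
    R at (even, even), G at (even, odd) and (odd, even), B at (odd, odd)."""
    if py % 2 == 0:
        return "R" if px % 2 == 0 else "G"
    return "G" if px % 2 == 0 else "B"

def bayer_color_at(mosaic, y, x):
    """Estimate (R, G, B) at pixel (y, x) by averaging the sensor elements
    of each color within the surrounding 3x3 neighborhood."""
    h, w = len(mosaic), len(mosaic[0])

    def mean_of(color):
        vals = [mosaic[y + dy][x + dx]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if 0 <= y + dy < h and 0 <= x + dx < w
                and pixel_color(y + dy, x + dx) == color]
        return sum(vals) / len(vals)

    return tuple(mean_of(c) for c in "RGB")

# A uniform gray scene: every sensor element records 100, so the
# reconstructed color at an interior pixel is (100, 100, 100).
mosaic = [[100.0] * 4 for _ in range(4)]
print(bayer_color_at(mosaic, 1, 1))  # → (100.0, 100.0, 100.0)
```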
At least with some of these sensors, the sensor elements are not read out simultaneously, but group-wise in a staggered manner. For example, CMOS-type sensors are frequently read out row by row (or column by column) in a case where the sensor elements are arranged in a matrix-like array. For example, assuming that the rows are numbered consecutively from a top side to a bottom side of the sensor, the start of readout of the second row may be delayed compared with the start of readout of the first row, the start of readout of the third row may be delayed compared with the start of readout of the second row, and so on.
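The staggered readout timing described above can be sketched numerically. Assuming a constant row-to-row delay (a simplifying assumption; the specific numbers are illustrative), each row starts readout one fixed increment after the previous one:

```python
# Hypothetical sketch of staggered row readout: with a constant per-row
# delay, row i of the sensor begins readout i * row_delay after row 0.
# The row count and delay value are illustrative assumptions.

def readout_start_times(num_rows, row_delay_us):
    """Return the readout start time (in microseconds) of each row,
    assuming a constant delay between consecutive rows."""
    return [row * row_delay_us for row in range(num_rows)]

# A 4-row sensor with a 10 microsecond row-to-row delay:
print(readout_start_times(4, 10))  # → [0, 10, 20, 30]
```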
While for recording images of non-moving objects this generally does not constitute a problem, when recording images of moving objects an effect sometimes referred to as “rolling shutter” may occur. In particular, when groups of sensor elements like rows of a CMOS sensor are not read out simultaneously, but delayed with respect to each other, the position at which a moving object is recorded in a first group of sensor elements may not coincide with the position at which the object is recorded in a second group of sensor elements. In this manner, for example, a straight vertical line moving to the left or to the right may be recorded as a line tilted to the right or to the left, respectively.
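The tilted-line effect described above can be quantified with a simple sketch: a vertical line moving horizontally at a constant speed is displaced in each row by the product of the speed and that row's readout delay. The parameter values below are illustrative assumptions, not figures from the text.

```python
# Hypothetical sketch of rolling-shutter skew: a vertical line moving
# horizontally appears shifted in each row by speed * (row readout delay),
# so the recorded line is tilted. All numbers are illustrative.

def recorded_column(true_column, row, row_delay_s, speed_px_per_s):
    """Column at which a horizontally moving vertical line is recorded in
    a given row, assuming a constant row-to-row readout delay."""
    return true_column + speed_px_per_s * row * row_delay_s

# A line at column 100 moving right at 1000 px/s, with rows read 1 ms
# apart: each successive row records the line one pixel further right.
cols = [recorded_column(100, r, 1e-3, 1000) for r in range(4)]
print(cols)  # → [100.0, 101.0, 102.0, 103.0]
```

A line moving left (negative speed) would correspondingly tilt the other way, matching the description above.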
A conventional approach to at least mitigate this problem is the use of a shutter which exposes the complete image sensor to illumination only for a short predetermined period of time, the period of time itself being, for example, dependent on the overall illumination of a scene to be recorded. However, such a shutter constitutes an additional mechanical part needed for image recording which, on the one hand, adds to the manufacturing costs. On the other hand, in applications where only little space is available, for example when a digital camera function is integrated into another mobile electronic device like a cellular phone, it is desirable not to use a shutter for space, weight, and/or design reasons.
Therefore, there is a continuing need to address the problem of rolling shutter.