Modern cameras standardly function according to the following configuration:

1. An optical mechanism produces a light image on a flat sensor (also called an image sensor or imager).
2. The image sensor samples this image spatially and temporally.
3. The temporal sampling is determined by the frame rate.
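The spatial and temporal sampling rates implied by this configuration can be sketched numerically. The pixel pitch and frame rate below are assumed example values, not taken from the text:

```python
# Sketch: sampling parameters of a hypothetical camera.
# Pixel pitch and frame rate are assumed illustrative values.

pixel_pitch_mm = 0.003   # 3 µm pixel pitch (assumed)
frame_rate_hz = 60.0     # frames per second (assumed)

# Spatial sampling: one light sensor per pixel pitch.
spatial_sampling_freq = 1.0 / pixel_pitch_mm    # samples per mm
spatial_nyquist = spatial_sampling_freq / 2.0   # highest resolvable spatial frequency (cycles/mm)

# Temporal sampling: one sample per frame.
temporal_nyquist = frame_rate_hz / 2.0          # highest resolvable temporal frequency (Hz)

print(f"Spatial Nyquist limit:  {spatial_nyquist:.1f} cycles/mm")
print(f"Temporal Nyquist limit: {temporal_nyquist:.1f} Hz")
```

Any image or scene content above these limits must be removed by the low-pass filters described next, or it will alias.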
Here, the standard camera components fulfill the following functions:

- The optical mechanism acts as a spatial low-pass filter.
- Today, an image sensor consists of a few million individual light sensors, each of which represents a spatial sampling point.
- The exposure time acts as a temporal low-pass filter.

For sampling free of aliasing, the low-pass filter must be matched to the sampling frequency such that the sampling frequency is at least twice the cutoff frequency of the low-pass filter (Shannon-Nyquist sampling theorem). In addition, the signal must always pass through the low-pass filter before it is sampled.
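What happens when the theorem is violated can be sketched with a pure sine tone. In this hedged example (frequencies chosen purely for illustration), a 70 Hz signal sampled at 100 Hz, with no low-pass filter in front of the sampler, produces exactly the same samples as a 30 Hz signal:

```python
import math

fs = 100.0        # sampling frequency in Hz (assumed)
f_signal = 70.0   # signal frequency, above the Nyquist limit fs/2 = 50 Hz

# After sampling, the frequency folds back into the band [0, fs/2].
f_alias = abs(f_signal - fs * round(f_signal / fs))
print(f"70 Hz sampled at 100 Hz appears as {f_alias} Hz")  # 30.0

# Verify: the samples of the 70 Hz sine equal those of a 30 Hz sine
# (up to sign, since 70 = 100 - 30 mirrors the phase).
n = range(8)
s_high = [math.sin(2 * math.pi * f_signal * k / fs) for k in n]
s_low  = [math.sin(2 * math.pi * f_alias  * k / fs) for k in n]
assert all(abs(a + b) < 1e-9 for a, b in zip(s_high, s_low))
```

Once the samples are taken, no later processing can distinguish the two signals; this is why the low-pass filter must precede the sampler.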
If this condition is not met, the sampled signal contains aliasing artifacts. Spatial aliasing effects are typically visible at high-contrast edges (as color fringing), at point-like objects (which disappear and reappear), and on objects with a uniform periodic pattern (as so-called moiré). In the temporal signal, aliasing shows up as rolling-shutter effects (a straight object is imaged as curved), as the "wagon wheel effect" (above a certain rotational speed, a wheel appears to rotate backwards), and as problems when recording pulsed light sources (LED traffic signals, vehicle rear lights). Such effects also occur in human vision, which shows that even there temporal aliasing is not completely suppressed. For applications in machine vision, optical flow is an important auxiliary quantity, and its quality is significantly improved by a temporally correctly sampled signal.