The quality of an image captured by a color imaging system depends primarily on three factors: the sensor's spectral sensitivity, the illumination, and the object scene. The illumination must be known. However, the spectral sensitivity characteristics are critical to the success of imaging applications and are necessary for the optimal design of an imaging system under practical constraints. Ultimately, image quality is judged subjectively by the human visual system.
A digital still camera (DSC) or video camera (camcorder) has a sensor that is covered by a color filter array (CFA), which determines the color sampled at each pixel location. A DSC typically uses red, green, and blue (RGB) filters to form its image. Most current camcorders use cyan, magenta, yellow, and green (CMYG) filters for the same purpose.
A conventional sensor is a charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS). An imaging system focuses a scene onto the sensor, generating electrical signals that correspond to the scene colors passed through the colored filters. Electronic circuits amplify and condition these electrical signals for each pixel location and then digitize them. Algorithms in the camera then process these digital signals, performing the operations needed to convert the raw digital signals into a pleasing color image that can be shown on a color display or sent to a color printer.
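The raw-to-image conversion described above can be sketched in miniature. The sketch below is purely illustrative: the gain value, the 2.2 gamma, and the tiny raw array are assumptions, and a real camera applies many more operations (demosaicing, noise reduction, color correction) than these two steps.

```python
import numpy as np

# Digitized sensor values for a tiny 2x2 region, normalized to [0, 1]
# (hypothetical values for illustration).
raw = np.array([[0.10, 0.40],
                [0.25, 0.80]])

# Amplify/condition the signal (illustrative fixed gain), clipping to range.
gain = 1.2
amplified = np.clip(raw * gain, 0.0, 1.0)

# Gamma-encode for display (2.2 is a common display gamma assumption).
display = amplified ** (1 / 2.2)
```

Each step is a simple element-wise operation here; in an actual camera pipeline the same stages involve per-channel gains and calibrated tone curves.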
Each color camera has a unique sensor, CFA, and analog electronics system. The sensor and CFA exhibit part-to-part variations. Accordingly, the electronic system needs to be calibrated for each camera. The goal is to make a "real world" scene captured with different cameras look the same when rendered on a display device. To calibrate an individual camera, the properties of that camera's primary color channels (CMYG for a camcorder; RGB for a DSC) need to be measured so that the camera's response to known colors can be quantified.
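One common way to use such per-camera measurements is to fit a linear correction matrix that maps the camera's responses to known reference values. The sketch below assumes made-up response and reference numbers for five hypothetical color patches; it shows the fitting step only, not a full calibration procedure.

```python
import numpy as np

# Measured camera RGB responses to five known color patches
# (hypothetical values for illustration).
camera_rgb = np.array([
    [0.80, 0.10, 0.05],
    [0.20, 0.70, 0.10],
    [0.05, 0.15, 0.75],
    [0.50, 0.50, 0.20],
    [0.30, 0.30, 0.30],
])

# Known reference values for the same patches (also hypothetical).
reference = np.array([
    [0.76, 0.25, 0.04],
    [0.35, 0.65, 0.15],
    [0.18, 0.12, 0.80],
    [0.55, 0.52, 0.18],
    [0.31, 0.32, 0.30],
])

# Least-squares fit of a 3x3 matrix M so that camera_rgb @ M ~ reference.
M, *_ = np.linalg.lstsq(camera_rgb, reference, rcond=None)

corrected = camera_rgb @ M
```

Because each camera's responses differ, M is computed per camera; applying each camera's own M is what makes the same scene render consistently across units.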
Color cameras require multiple classes of sensors with different spectral sensitivities. Such classes can be created by placing CFAs in series with the sensors, usually on a pixel-by-pixel basis. When the color filters are arranged in a mosaic pattern, one color per pixel, the cameras are referred to as CFA cameras.
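The one-color-per-pixel sampling can be made concrete with a small simulation. The sketch below assumes a Bayer-style RGGB tiling (one common mosaic; the text itself does not specify a pattern) and a tiny random full-color scene:

```python
import numpy as np

# A hypothetical 4x4 full-color scene, values in [0, 1].
h, w = 4, 4
rng = np.random.default_rng(0)
scene = rng.random((h, w, 3))

# Channel index recorded at each pixel: 0=R, 1=G, 2=B,
# tiled in a Bayer-style RGGB pattern (an assumed example layout).
bayer = np.tile(np.array([[0, 1],
                          [1, 2]]), (h // 2, w // 2))

# The CFA camera records only one color sample per pixel.
rows, cols = np.indices((h, w))
raw = scene[rows, cols, bayer]   # shape (h, w): the raw mosaic
```

The missing two channels at each pixel are later estimated by demosaicing, which is why the raw mosaic alone is not yet a viewable color image.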
Cameras rely on illumination to capture information about an object scene, and they are used and calibrated under a variety of illuminants. Estimating how information obtained under one illuminant would transform under another illuminant is a challenge.
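One simple, widely used approximation for such illuminant transforms is a diagonal (von Kries-style) scaling; it is shown here only as an illustrative sketch, with made-up white-patch responses, not as the method the text endorses:

```python
import numpy as np

# Hypothetical camera RGB responses to a neutral (white) patch
# under two different illuminants, A and B.
white_under_A = np.array([0.9, 0.7, 0.5])
white_under_B = np.array([0.6, 0.7, 0.9])

# Per-channel gains mapping illuminant-A responses toward illuminant-B.
scale = white_under_B / white_under_A

# Estimate how a surface captured under A would respond under B.
rgb_under_A = np.array([0.45, 0.35, 0.25])
rgb_under_B_est = rgb_under_A * scale
```

A diagonal scaling is only an approximation; its accuracy depends on the camera's spectral sensitivities, which is one reason knowing those sensitivities matters.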
The sensor module, which consists of the electronic sensor, color filter, and optical lens, is a critical part of digital color camera design. A camera is the input end of a color input-output system; thus, its ability to acquire accurate signals in a noisy environment contributes significantly to the subsequent processing and to output image quality.
More and more attention is being paid to spectral-based approaches, in which both the input signal and the output signal are treated as spectral power distributions. For example, the object surface reflectance is captured spectrally, and rendered or printed spectrally to match the original surface characteristics.
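In such a spectral view, a channel's response is modeled as the wavelength-by-wavelength product of illuminant power, surface reflectance, and sensor spectral sensitivity, summed over wavelength. The sketch below uses an assumed 400–700 nm grid and made-up spectra (a flat illuminant, a synthetic reflectance, a Gaussian green-like channel):

```python
import numpy as np

# Assumed discretization: 400-700 nm in 10 nm steps.
wl = np.arange(400, 701, 10).astype(float)

illuminant = np.ones_like(wl)                      # flat, equal-energy light
reflectance = 0.5 + 0.4 * np.sin(wl / 50.0)        # made-up surface spectrum
sensitivity = np.exp(-((wl - 550) / 40.0) ** 2)    # made-up green-like channel

# Channel response: elementwise product summed over wavelength.
response = np.sum(illuminant * reflectance * sensitivity)
```

The model is linear in each factor, which is what makes spectral capture composable: changing the illuminant or the surface changes the response predictably.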
Spectral sensitivity may be obtained via direct measurement of a particular camera's output. The spectral sensitivity (SS) functions can then be used to determine the mapping between device output signals and object color perception values for any samples under any illuminant of interest. Direct measurement of spectral sensitivities with a spectroradiometer and monochromator gives rather accurate results. It is well known how a digital camera can be calibrated once its SS curves are known. Unfortunately, directly measuring the SS curves of every camera in a set is both time-consuming and expensive.
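The claim that known SS curves determine the device-to-colorimetry mapping can be illustrated by simulation: with SS curves in hand, both device signals and colorimetric values can be computed for any reflectances under any illuminant, and a mapping fitted between them. Everything below is a placeholder: the Gaussian "SS" and "matching" curves are stand-ins (not real CIE color matching functions), and the 24 random patches are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(400, 700, 31)          # assumed 31-band spectral grid

# Stand-in Gaussian curves: camera SS (rows: R, G, B) and
# observer matching functions (stand-ins, not real CIE curves).
ss = np.stack([np.exp(-((wl - c) / 35.0) ** 2) for c in (610, 540, 460)])
cmf = np.stack([np.exp(-((wl - c) / 45.0) ** 2) for c in (600, 555, 450)])

illuminant = np.ones_like(wl)                            # flat illuminant
reflectances = rng.uniform(0.05, 0.95, size=(24, 31))    # 24 synthetic patches

radiance = reflectances * illuminant     # light reaching camera and observer
device = radiance @ ss.T                 # simulated device signals, (24, 3)
colorimetric = radiance @ cmf.T          # simulated perception values, (24, 3)

# With SS curves known, the device-to-colorimetry map can be fitted
# for this illuminant without physically measuring each patch.
M, *_ = np.linalg.lstsq(device, colorimetric, rcond=None)
```

This is precisely why SS curves are so valuable, and why their direct measurement being slow and expensive motivates estimating them by other means.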