Technical Field
This invention relates to image editing systems and methods. More particularly, this invention relates to image editing systems and methods that work with images stored in digital format.
Description of Related Art
The “dynamic range” of a scene is the contrast ratio between its brightest and darkest parts. A plate of evenly lit mashed potatoes outside on a cloudy day has a very low dynamic range. The interior of an ornate cathedral with light streaming in through its stained-glass windows has a very high dynamic range. Scenes in which light sources can be seen directly also usually have a very high dynamic range.
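The contrast-ratio definition above can be illustrated with a short sketch; the luminance values below are invented examples, not measurements:

```python
# Dynamic range expressed as the contrast ratio between the
# brightest and darkest luminance values in a scene.
def dynamic_range(luminances):
    return max(luminances) / min(luminances)

# Illustrative (invented) luminance values:
flat_scene = [90.0, 100.0, 110.0]        # evenly lit plate, low contrast
cathedral = [0.05, 1.0, 300.0, 5000.0]   # deep shadow to sunlit stained glass

low = dynamic_range(flat_scene)          # a ratio near 1
high = dynamic_range(cathedral)          # a ratio of about 100,000
```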
As is well known, there are a variety of systems and methods for editing images. Examples of editing operations include cropping, rotating, translating, convolving, cloning, retouching, painting, and re-sampling.
In more recent times, computers have been used for image editing. The image is typically represented by a set of pixels. Each pixel is typically assigned values representing its color and intensity.
Unfortunately, most digital editing systems have a low dynamic range, meaning that the contrast ratio between the lightest and darkest pixel that they can display is small. In many systems, for example, only 256 distinct intensity levels can be specified, from 0 to 255.
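As a rough sketch of how such a limit arises, the following assumes a simple linear 8-bit encoding (real systems vary); the 256 available levels quantize scene intensity coarsely:

```python
def quantize_8bit(intensity, max_intensity):
    """Map a scene intensity onto one of 256 linear levels (0 to 255)."""
    level = round(255 * intensity / max_intensity)
    return max(0, min(255, level))

# In a scene with a 100,000:1 range, distinct deep-shadow intensities
# collapse onto the lowest one or two levels:
a = quantize_8bit(1.0, 100000.0)     # -> 0
b = quantize_8bit(300.0, 100000.0)   # -> 1
```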
In the real world, on the other hand, there is no theoretical limit to the dynamic range of an image. Most real-world images, moreover, have a dynamic range in excess of 100,000:1, and some exceed 1,000,000:1. Indeed, most people are capable of perceiving dynamic ranges far in excess of the limited 255:1 range found on many computer systems.
When a high dynamic range image is edited by a low dynamic range image editing system, the high dynamic range image is typically converted into an image file having the lower dynamic range of the image editing system. Important variations at very low intensities and at very high intensities are usually lost in the conversion process.
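A minimal sketch of the kind of lossy conversion described above, assuming simple linear scaling with clipping (actual converters may use more elaborate tone mapping):

```python
def hdr_to_ldr(pixels, white_point):
    """Scale floating-point HDR intensities into 0-255, clipping highlights."""
    out = []
    for p in pixels:
        level = round(255 * p / white_point)
        out.append(max(0, min(255, level)))
    return out

hdr = [0.02, 0.05, 800.0, 2000.0, 5000.0]   # invented scene intensities
ldr = hdr_to_ldr(hdr, white_point=800.0)
# Shadow detail (0.02 vs. 0.05) collapses to level 0, and the
# highlights from 800 to 5000 all clip to level 255 -- exactly the
# loss at very low and very high intensities described above.
```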
A still further problem with existing low dynamic range image editing systems is that the differences in assigned pixel intensity values often fail to bear a linear relationship to the differences in the intensity levels of the portions of the actual scene that those pixels represent. These non-linearities cause further distortions, in addition to the loss of information that occurs when a high dynamic range image is converted to the low dynamic range needed for the image editing system.
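The non-linearity described above can be sketched with a simple power-law encoding (a generic gamma-style curve, used here only as an illustration):

```python
def encode_gamma(intensity, gamma=2.2):
    """Non-linear encoding: pixel value proportional to intensity^(1/gamma)."""
    return round(255 * intensity ** (1.0 / gamma))

# Two equal steps in scene intensity (each 0.10) produce unequal
# steps in pixel value, so pixel-value differences are not linearly
# related to scene-intensity differences:
a = encode_gamma(0.10)
b = encode_gamma(0.20)
c = encode_gamma(0.90)
d = encode_gamma(1.00)
shadow_step = b - a      # large step in pixel value
highlight_step = d - c   # much smaller step in pixel value
```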
A still further problem with image editing systems is that the editing process is often slowed by repeated applications of a non-linear tone mapping curve or function, such as a gamma correction curve. As is well known, display systems are often adjusted with such a curve to better match differences in the intensity levels of the real-life image. Unfortunately, applying the curve typically involves complex computations that take significant time during the editing process.
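One standard way to reduce the cost of the per-pixel power computation mentioned above is to precompute the tone curve as a lookup table; the sketch below illustrates that general optimization (it is an assumption for illustration, not the method of the present invention):

```python
# Precompute the gamma curve once for all 256 possible 8-bit inputs,
# so each pixel needs only a table lookup rather than a pow() call.
GAMMA = 2.2
LUT = [round(255 * (v / 255) ** (1.0 / GAMMA)) for v in range(256)]

def gamma_correct(pixels):
    """Apply the tone curve to 8-bit pixel values via the lookup table."""
    return [LUT[p] for p in pixels]

corrected = gamma_correct([0, 64, 128, 255])
```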