1. Field of the Invention
The present invention relates to a color-reduction processing technique for a color image.
2. Description of the Related Art
In recent years, along with the digitization of information, systems have prevailed in which a paper document is scanned by, for example, a scanner and converted into digital data, and the digital data is stored or transmitted to another apparatus instead of the paper document being preserved intact. Also, the colors used in documents themselves have transitioned from monochrome binary values to color values (multi-values). A digital document is required to clearly reproduce characters and a background comprised of a uniform color, so as to attain high readability and a high compression ratio. However, when a paper document is digitized in practice, various kinds of noise, blurring, and nonuniformity are generated. In particular, since the nonuniformity and blurring of color regions of a color document adversely influence readability and image quality, color-reduction processing is executed by selecting representative colors from the image.
Watanabe, “A Fast Algorithm for Color Image Quantization Using Only 256 Colors”, The Transactions of the Institute of Electronics, Information and Communication Engineers, 70(4), pp. 720-726, April 1987 (non-patent literature 1) discloses a method of color-reducing an input image to a 256-color image. The RGB color space is divided into 4×4×4 cube areas, and a color occurrence frequency distribution (histogram) over the 4×4×4 cube areas is calculated by referring to the upper 2 bits of the input digital color signal (8 bits for each of the R, G, and B (red, green, and blue) signals). Next, a variance value is calculated for each of the 64 cube areas, and each of the 32 cube areas having the largest variances is divided into two. Furthermore, an operation of calculating the variance values of the divided areas and dividing the areas having large variances into two is repeated until the number of areas reaches 256. Finally, the average color calculated for each of the 256 divided areas is determined as a representative color.
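The procedure described above can be sketched in Python as follows. This is an illustrative simplification, not the exact algorithm of non-patent literature 1: it splits one highest-variance area at a time rather than 32 areas per pass, and the choice of splitting each area at the median of its highest-variance color component is an assumption; the function name `quantize_colors` is likewise hypothetical.

```python
import numpy as np

def quantize_colors(image, num_colors=256):
    """Variance-based splitting color quantization (illustrative sketch).

    image: (H, W, 3) uint8 RGB array.
    Returns a list of representative colors (average color of each area).
    """
    p8 = image.reshape(-1, 3)
    # Initial partition: 4x4x4 = 64 cube areas, keyed by the upper
    # 2 bits of each 8-bit R, G, B component.
    keys = (p8[:, 0] >> 6) * 16 + (p8[:, 1] >> 6) * 4 + (p8[:, 2] >> 6)
    pixels = p8.astype(np.float64)
    areas = [pixels[keys == k] for k in range(64) if np.any(keys == k)]

    # Repeatedly split the highest-variance area in two until the
    # number of areas reaches num_colors.
    while len(areas) < num_colors:
        # Pixel-count-weighted total variance (summed over R, G, B).
        variances = [a.var(axis=0).sum() * len(a) for a in areas]
        i = int(np.argmax(variances))
        if len(areas[i]) < 2 or variances[i] == 0.0:
            break  # nothing left to split
        a = areas.pop(i)
        # Split along the component with the largest variance,
        # at the median of that component (an assumed split rule).
        c = int(np.argmax(a.var(axis=0)))
        order = np.argsort(a[:, c])
        half = len(a) // 2
        areas.append(a[order[:half]])
        areas.append(a[order[half:]])

    # Representative color = average color of each divided area.
    return [a.mean(axis=0) for a in areas]
```

Splitting one area per iteration keeps the sketch short; splitting the 32 highest-variance areas per pass, as in the literature, changes only the selection step, not the overall structure.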
International Publication No. WO2006/066325 (patent literature 1) discloses a technique that divides an input image into a plurality of blocks and performs color substitution for each block. Furthermore, Japanese Patent Laid-Open No. 2006-333175 (patent literature 2) discloses a technique that applies region recognition to the entire page, stores the recognition result, and then executes color-reduction processing using a predetermined number of colors for each attribute such as “text” or “photo”.
However, the color-reduction method described in non-patent literature 1 involves high computational complexity because the variance calculation process and the color space division process must be executed repeatedly. In addition, since the RGB color space is used and the variances of the respective color components are calculated for each area, complicated calculations are required. Patent literature 1 executes color reduction for each tile so as to hold the color information of a local region. However, since a small region of interest is set at the time of color reduction and the color information of that small region is left, the technique disclosed in this literature is susceptible to the influence of noise such as color blurring. Hence, in order to remove the noise, a histogram is generated to apply color quantization processing, the noise is determined by checking the shapes and positions of the quantized regions, and a histogram is then generated again to apply the quantization processing. That is, since histogram calculations, which involve a large amount of computation, are required a plurality of times, the computation cost increases. Furthermore, patent literature 2 executes color-reduction processing after region recognition is applied to the entire page. For this reason, the recognition result has to be held in a memory. The technique disclosed in this literature allows easy software implementation, but it is not suited to hardware implementation based on parallel processing and sequential processing.