1. Field of the Invention
The present invention relates to an image processing apparatus and method and, more particularly, to an image processing apparatus and method for magnifying a digital multivalued image.
2. Description of the Related Art
An apparatus of this type associated with digital binary images has already been disclosed in Japanese Patent Laid-Open No. 5-174140.
In this proposal, when a binary image is to be subjected to magnifying processing, the contour information of the character/line drawing components of the binary image is extracted, and a magnified image is generated on the basis of the extracted contour information instead of magnifying the binary image itself. It is an object of the proposal to obtain a binary image with high quality.
More specifically, in Japanese Patent Laid-Open No. 5-174140, outline vectors are extracted from a binary image, and smoothing processing is performed on the extracted outline vector data. The smoothed outline vector data is magnified at a desired (arbitrary) magnification. Contours are then drawn and the areas defined by the contours are painted to reproduce a binary image. With this operation, a high-quality digital binary image magnified at the desired (arbitrary) magnification is obtained.
The main part of this proposal will be briefly described below. FIG. 56 is a block diagram best representing the characteristic features disclosed in Japanese Patent Laid-Open No. 5-174140. Referring to FIG. 56, a binary image acquisition unit 101 acquires a digital binary image to be subjected to magnifying processing, and outputs a raster scanning type binary image. An outline extraction unit 102 extracts coarse contour vectors (outline vectors before smoothing/magnifying processing) from the raster scanning type binary image. An outline smoothing/magnifying unit 103 performs smoothing/magnifying processing for the coarse contour vector data in the form of vector data. A binary image reproduction unit 104 reproduces raster scanning type binary image data from the outline vector data. A binary image output unit 105 is a printer or display device for displaying the raster scanning type binary image data, producing a hard copy, or outputting the data to a communication line or the like.
For example, the binary image acquisition unit 101 is constituted by a known binary image input device for reading an original image as a binary image and outputting the read image as binary data in the raster scanning form.
For example, the outline extraction unit 102 is constituted by a device disclosed in Japanese Patent Laid-Open No. 4-157578 previously proposed by the assignee of the present application. FIG. 57 shows a scanning form for raster scanning type binary image data output from the binary image acquisition unit 101, and also a scanning form for raster scanning type binary image data received by the outline extraction unit 102. In this form, the outline extraction unit 102 receives raster scanning type binary image data output from the binary image acquisition unit 101. Referring to FIG. 57, a pixel 111 is a given pixel of a binary image during a raster scanning operation, and an area 112 is a 9-pixel area including eight pixels adjacent to the pixel 111. The above outline extraction unit disclosed in Japanese Patent Laid-Open No. 4-157578 switches target pixels in the raster scanning order, and detects a contour side vector (horizontal or vertical vector) between each target pixel and each adjacent pixel in accordance with the state of each pixel (white or black pixel) in a 9-pixel area like the area 112. If a contour side vector is present, the outline extraction unit extracts the start coordinates and direction data of the side vector, and sequentially extracts coarse contour vectors while updating the relationship in connection between these side vectors.
FIG. 58 shows a state wherein contour side vectors between a target pixel and pixels adjacent to the target pixel are extracted. Referring to FIG. 58, a mark "Δ" represents the start point of a vertical vector (or the end point of a horizontal vector), and a mark "○" represents the start point of a horizontal vector (or the end point of a vertical vector).
FIG. 59 shows coarse contour vector loops extracted by the outline extraction unit described above. In this case, each square defined by the matrix indicates the pixel position of an input image; each blank square, a white pixel; and each square marked "●", a black pixel. Similar to FIG. 58, each mark "Δ" represents the start point of a vertical vector; and each mark "○", the start point of a horizontal vector.
As is apparent from the case shown in FIG. 59, the outline extraction unit 102 extracts each area of connected pixels as a coarse contour vector loop composed of horizontal and vertical vectors which always appear alternately and continuously, although the horizontal and vertical vectors differ in length. Note that in this case, extraction processing is performed such that a black pixel area is located on the right side with respect to the direction of the extraction processing. In addition, the start point coordinates of the coarse contour vectors are extracted as the middle positions between the respective pixels of the input image. That is, when the position of each pixel is expressed by integers (x, y), the start point of an extracted vector is expressed by values obtained by adding or subtracting 0.5 to or from the respective coordinate values. More specifically, even a single pixel in an original image is treated as a rectangle having a significant area and is extracted as a coarse contour loop.
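The extraction rules above, namely side vectors on half-integer coordinates with the black pixel area always on the right of the direction of travel, can be sketched as follows. This is an illustrative Python sketch, not the device of Japanese Patent Laid-Open No. 4-157578; the function name and data layout are assumptions, and linking the side vectors into closed loops is omitted.

```python
def contour_side_vectors(img):
    """Emit unit contour side vectors for a binary image.

    img is a list of rows of 0 (white) / 1 (black).  Each side vector is a
    ((x0, y0), (x1, y1)) pair lying on half-integer coordinates, oriented so
    that the black pixel area is on the right of the direction of travel.
    """
    h, w = len(img), len(img[0])

    def black(x, y):
        return 0 <= x < w and 0 <= y < h and img[y][x] == 1

    vecs = []
    for y in range(h):
        for x in range(w):
            if not black(x, y):
                continue
            if not black(x, y - 1):   # top edge: travel left to right
                vecs.append(((x - 0.5, y - 0.5), (x + 0.5, y - 0.5)))
            if not black(x + 1, y):   # right edge: travel top to bottom
                vecs.append(((x + 0.5, y - 0.5), (x + 0.5, y + 0.5)))
            if not black(x, y + 1):   # bottom edge: travel right to left
                vecs.append(((x + 0.5, y + 0.5), (x - 0.5, y + 0.5)))
            if not black(x - 1, y):   # left edge: travel bottom to top
                vecs.append(((x - 0.5, y + 0.5), (x - 0.5, y - 0.5)))
    return vecs

# A single black pixel yields the four side vectors of its bounding rectangle.
print(contour_side_vectors([[1]]))
```

A single isolated black pixel thus produces four side vectors that chain end-to-start into one coarse contour loop, consistent with the rectangle extraction described above.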
The coarse contour vector group extracted in this manner is output from the outline extraction unit 102 in FIG. 56 according to a data format like the one shown in FIG. 60. That is, the coarse contour vector group is constituted by a total number n of coarse contour loops extracted from an image, and a group of coarse contour loop data of the first contour loop to the nth contour loop. Each coarse contour loop data is constituted by the total number of the start points of contour side vectors (equivalent to the total number of contour side vectors) present in the coarse contour loop, and a string of the values (the start points of horizontal and vertical vectors are alternately arranged) of the start point coordinates (x- and y-coordinate values) of the respective contour side vectors in the order of constituting the loop.
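The data format of FIG. 60 can be sketched as Python data structures. The class and field names here are assumptions for illustration; the original format is simply a loop count followed, for each loop, by a point count and a coordinate string.

```python
from dataclasses import dataclass, field

@dataclass
class CoarseContourLoop:
    # Start points of the contour side vectors, in loop order; horizontal-
    # and vertical-vector start points alternate, so consecutive points
    # share either an x- or a y-coordinate.
    points: list = field(default_factory=list)

    @property
    def total_points(self):
        # Equals the total number of contour side vectors in the loop.
        return len(self.points)

@dataclass
class CoarseContourData:
    loops: list = field(default_factory=list)

    @property
    def total_loops(self):
        # The total number n of coarse contour loops extracted from the image.
        return len(self.loops)

# The coarse contour loop of a single black pixel: four alternating vectors.
loop = CoarseContourLoop([(-0.5, -0.5), (0.5, -0.5), (0.5, 0.5), (-0.5, 0.5)])
data = CoarseContourData([loop])
print(data.total_loops, loop.total_points)  # → 1 4
```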
The outline smoothing/magnifying unit 103 shown in FIG. 56 receives the coarse contour vector data (see FIG. 60) output from the outline extraction unit 102. The unit 103 then performs smoothing processing and magnifies the data at a desired magnification in the form of outline vector data (coordinate values).
FIG. 61 shows the arrangement of the outline smoothing/magnifying unit 103 in more detail. Referring to FIG. 61, a first smoothing/magnifying unit 152 smoothes and magnifies input coarse contour data at a magnification set by a magnification setting unit 151. A second smoothing unit 153 further performs smoothing of the processing result to obtain a final output. The magnification setting unit 151 may supply a value set by a DIP switch, a dial switch, or the like in advance to the first smoothing/magnifying unit 152, or may supply a value externally provided via an I/F (interface) to the first smoothing/magnifying unit 152. The magnification setting unit 151 is a unit for providing information designating specific magnifications respectively in the main scanning (horizontal) direction and the subscanning (vertical) direction with respect to an image size supplied as input data.
The first smoothing/magnifying unit 152 receives magnification information from the magnification setting unit 151 and performs smoothing/magnifying processing.
FIG. 62 shows a hardware arrangement for realizing outline smoothing/magnifying processing. Referring to FIG. 62, a ROM 164 stores operation procedures and the like executed by a CPU 161.
An output from the outline extraction unit 102 in FIG. 56 is stored, as a file (coarse contour vector data), in a disk unit 162 according to the data format shown in FIG. 60.
The CPU 161 operates in accordance with the procedure shown in FIG. 63 to execute outline smoothing/magnifying processing.
Referring to FIG. 63, in step S170, the CPU 161 reads out coarse contour data from the disk unit 162 via a disk I/O 163, and loads it in a working memory area (not shown) of a RAM 166. In step S171, the CPU 161 performs first smoothing/magnifying processing.
The first smoothing processing is performed for each closed loop of the coarse contour data. Each contour side vector (horizontal or vertical vector) of each coarse contour data is sequentially set as a target vector, and at most three continuous side vectors before and after each target contour side vector (i.e., at most seven side vectors in total: three sides before the target side, the target side itself, and three sides after the target side) are classified into a pattern according to a combination of the lengths and directions of the continuous side vectors. With respect to each pattern, the CPU 161 then outputs additive information (to be referred to as corner point information hereinafter) indicating the coordinate value of each contour point after the first smoothing processing and whether the contour point is a point at a corner. In this case, a "point at a corner" means a point located at a significant corner, i.e., a point which is not changed by smoothing processing. Points other than corner points are regarded as points derived from noise or from jagged portions and notches caused by other factors. A contour point after the first smoothing processing which is determined as a corner point is treated as a point which is not smoothed by the second smoothing processing, i.e., as a fixed point. In other words, a contour point after the first smoothing processing which is not determined as a corner point (to be referred to as a non-corner point hereinafter) is further smoothed by the second smoothing processing.
FIG. 64 shows this state, i.e., a target coarse contour side vector D_i, the three side vectors D_(i-1), D_(i-2), and D_(i-3) before it, the three side vectors D_(i+1), D_(i+2), and D_(i+3) after it, and the contour point after the first smoothing processing which is defined with respect to the target side D_i. That is, a vector (an oblique vector is allowed) connecting the contour points redefined in this manner is generated.
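The role of the first smoothing processing, redefining one contour point with corner point information for each target side D_i, can be sketched as follows. This is a Python sketch only: the corner rule used here (treating any side of length 3 or more as significant) is an assumed stand-in, since the actual pattern classification over up to seven consecutive sides is defined by the patent's tables.

```python
def first_smoothing(points):
    """Illustrative sketch of the first smoothing pass for one closed loop.

    points: start points of the loop's contour side vectors (alternating
    horizontal/vertical vectors).  For each target side D_i, one contour
    point is output at its midpoint together with corner point information.
    The corner test below is an assumed stand-in rule, not the patent's
    actual classification of the seven-side pattern.
    """
    n = len(points)
    out = []
    for i in range(n):
        (x0, y0) = points[i]
        (x1, y1) = points[(i + 1) % n]      # loop is closed: wrap at the end
        mid = ((x0 + x1) / 2.0, (y0 + y1) / 2.0)
        is_corner = abs(x1 - x0) + abs(y1 - y0) >= 3   # assumed threshold
        out.append((mid, is_corner))
    return out
```

Because each output point sits at the midpoint of its side, the vectors connecting consecutive output points may be oblique, which is how the jagged staircase of the coarse contour is removed.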
The contents of the first smoothing processing have been described above. Data after the first smoothing processing are sequentially created on a predetermined area of the RAM 166. As a result, the vector data after the first smoothing processing allows oblique vectors with respect to the coarse contour vector data, and hence an image without any jagged portion can be generated. Upon completion of the processing in step S171 in FIG. 63, the CPU 161 performs the second smoothing processing in step S172.
In the second smoothing processing, data after the first smoothing processing is input and processed. That is, the CPU 161 receives data indicating the number of closed loops, data indicating the number of contour points of each closed loop, the coordinate value data string of contour points of each closed loop after the first smoothing processing, and the additive information data string of the contour points of each closed loop after the first smoothing processing, and outputs contour point data after the second smoothing processing.
As shown in FIG. 65, the contour data after the second smoothing processing is constituted by the number of closed loops, a contour point count table for each closed loop, and the coordinate value data string of the contour points of each closed loop after the second smoothing processing.
The second smoothing processing will be briefly described below with reference to FIG. 66. As in the first smoothing processing, the second smoothing processing is performed in units of contour loops, and within each contour loop in units of contour points.
If a target contour point is a corner point, an input contour point coordinate value is used as contour point coordinate data which has undergone the second smoothing processing with respect to the target contour point. That is, no change is made.
If a target contour point is a non-corner point, a coordinate value obtained as the weighted mean of the coordinate values of the contour points before and after the target contour point and the coordinate value of the target contour point itself is used as the contour point coordinate value which has undergone the second smoothing processing with respect to the target contour point. More specifically, letting P_i(x_i, y_i) be the target input contour point (a non-corner point), P_(i-1)(x_(i-1), y_(i-1)) be the contour point immediately before P_i in the input contour loop, P_(i+1)(x_(i+1), y_(i+1)) be the contour point immediately after P_i, and Q_i(x'_i, y'_i) be the contour point which has undergone the second smoothing processing with respect to P_i:

x'_i = k_(i-1)·x_(i-1) + k_i·x_i + k_(i+1)·x_(i+1)
y'_i = k_(i-1)·y_(i-1) + k_i·y_i + k_(i+1)·y_(i+1)   (1)
In this case, k_(i-1) = k_(i+1) = 1/4 and k_i = 1/2.
Referring to FIG. 66, points P0, P1, P2, P3, and P4 form a part of a series of continuous contour points, as input data, which have undergone the first smoothing processing, points P0 and P4 are corner points, and points P1, P2, and P3 are non-corner points. The processing results obtained at this time are respectively indicated by points Q0, Q1, Q2, Q3, and Q4. Since the points P0 and P4 are corner points, their coordinate values are used as the coordinate values of the points Q0 and Q4 without any change. The point Q1 has coordinate values calculated from the points P0, P1, and P2 according to the above equations. Similarly, the points Q2 and Q3 respectively have coordinate values calculated from the points P1, P2, and P3 and the points P2, P3, and P4 according to the above equations.
The CPU 161 performs the second smoothing processing with respect to the contour data in a predetermined area of the RAM 166, which has undergone the first smoothing processing, in the above-described manner. The second smoothing processing is sequentially performed for each loop in the order of the first loop, the second loop, the third loop, and so on. When processing with respect to all loops is completed, the second smoothing processing is ended. In each loop, processing is sequentially performed in the order of the first point, the second point, the third point, and so on. When the processing indicated by equations (1) is completed with respect to all the contour points in one loop, the processing of this loop is ended, and the next loop is processed.
Assume that L contour points are present in a loop. In this case, a point before the first point is the Lth point. In other words, a point after the Lth point is the first point. The contour point data obtained by the second smoothing processing has the same total number of loops as that of the contour data after the first smoothing processing, with the number of contour points on each loop remaining the same. The CPU 161 outputs the above result to another area of the RAM 166 or the disk unit 162 according to the format shown in FIG. 65, and completes the second smoothing processing (step S172).
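The per-point weighted-mean step of equations (1), with corner points held fixed and indices taken cyclically over the L points of a loop, can be sketched as follows. The function name and list-based data layout are assumptions for illustration.

```python
def second_smoothing(points, corner_flags):
    """Second smoothing for one closed loop of L contour points.

    Corner points are fixed; a non-corner point P_i is replaced by the
    weighted mean Q_i = 1/4*P_(i-1) + 1/2*P_i + 1/4*P_(i+1), with indices
    taken cyclically (the point before the first point is the Lth point,
    and the point after the Lth point is the first point).
    """
    L = len(points)
    out = []
    for i in range(L):
        if corner_flags[i]:
            out.append(points[i])        # corner points pass through unchanged
            continue
        (xp, yp) = points[i - 1]          # wraps to the Lth point when i == 0
        (xc, yc) = points[i]
        (xn, yn) = points[(i + 1) % L]
        out.append((0.25 * xp + 0.5 * xc + 0.25 * xn,
                    0.25 * yp + 0.5 * yc + 0.25 * yn))
    return out
```

Note that the output loop has exactly as many points as the input loop, matching the statement above that the total number of loops and the number of contour points per loop are unchanged.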
The flow then advances to step S173 to transfer the data obtained by the second smoothing processing to a binary image reproduction unit 104 via the I/O 163, thereby completing the series of operations shown in FIG. 56.
For example, the binary image reproduction unit 104 can be constituted by a device disclosed in Japanese Patent Laid-Open No. 5-20467 proposed by the assignee of the present application. This device outputs, in the raster scanning form, a binary image generated by painting the areas enclosed by the vector graphic pattern expressed by the contour data which has been obtained by the second smoothing processing and transferred via the I/O. In addition, according to the description of this proposal, the binary image data is visualized by using a binary image output unit such as a video printer.

The proposal disclosed in Japanese Patent Laid-Open No. 6-12490 is an improvement on Japanese Patent Laid-Open No. 5-174140 described above. This proposal aims at preventing an image magnified at a low magnification from becoming excessively thick. More specifically, the outline extraction unit in Japanese Patent Laid-Open No. 5-174140 is modified to extract a vector at a position located between white and black pixels but closer to the black pixel (i.e., the black pixel area is made smaller in width than the white pixel area), and outline smoothing is performed in accordance with this vector extraction.
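The area painting performed by the binary image reproduction unit 104 can be sketched with a per-pixel even-odd test. This is a minimal Python sketch, not the actual device of Japanese Patent Laid-Open No. 5-20467; efficiency and sub-pixel coverage are ignored, and the function name is an assumption.

```python
def paint_loops(loops, width, height):
    """Reproduce a raster binary image from closed contour loops.

    A pixel (x, y) is painted black when a horizontal ray from its center
    crosses the loop edges an odd number of times (even-odd rule).
    """
    img = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            inside = False
            for loop in loops:
                n = len(loop)
                for i in range(n):
                    (x0, y0) = loop[i]
                    (x1, y1) = loop[(i + 1) % n]
                    # Does this edge cross the rightward ray from (x, y)?
                    if (y0 > y) != (y1 > y):
                        xc = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
                        if xc > x:
                            inside = not inside
            img[y][x] = 1 if inside else 0
    return img
```

For example, painting the half-integer rectangle enclosing pixels (0, 0) to (1, 1) on a 3 x 3 raster blackens exactly those four pixels.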
According to Japanese Patent Laid-Open No. 5-174140 described above, good magnified images can be obtained particularly for binary images such as characters, line drawings, tables, and graphic patterns.
In general, a multivalued digital image is magnified by performing interpolation processing using the pixel values of the multivalued image. This interpolation processing is performed by a known magnifying method such as a method of interpolating pixel values between the pixels of a sampled original image by using a bilinear function or a method of performing interpolation by approximating a sampling function with a cubic expression. For example, such methods are described in Tamura Hideyuki, "Introduction to Computer Image Processing" Souken Shuppan (1985).
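The bilinear interpolation mentioned above can be sketched as follows. This is a minimal Python sketch; the function name and the restriction to integer magnification factors are assumptions for illustration.

```python
def magnify_bilinear(img, sx, sy):
    """Magnify a multivalued (grayscale) image by bilinear interpolation.

    img: 2D list of pixel values; sx, sy: integer magnification factors in
    the main scanning (horizontal) and subscanning (vertical) directions.
    Each output pixel is mapped back into the source grid and interpolated
    linearly from the four surrounding source pixels.
    """
    h, w = len(img), len(img[0])
    out_h, out_w = h * sy, w * sx
    out = [[0.0] * out_w for _ in range(out_h)]
    for oy in range(out_h):
        for ox in range(out_w):
            # Position of this output pixel in source coordinates.
            fx = min(ox / sx, w - 1.0)
            fy = min(oy / sy, h - 1.0)
            x0, y0 = int(fx), int(fy)
            x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
            ax, ay = fx - x0, fy - y0
            top = (1 - ax) * img[y0][x0] + ax * img[y0][x1]
            bot = (1 - ax) * img[y1][x0] + ax * img[y1][x1]
            out[oy][ox] = (1 - ay) * top + ay * bot
    return out
```

Because every output value is a weighted average of nearby source values only, edges in the source are spread over several output pixels, which is one source of the quality problems noted below.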
In the above technique of magnifying a multivalued digital image by interpolating pixel values with attention paid only to the pixel values of the original image, a deterioration in image quality, e.g., a jagged pattern or lattice-like distortion, may occur.