The present invention relates to an image processing apparatus and method and a transmission medium, and, more specifically, to an image processing apparatus and method and a transmission medium in which one of a plurality of rendering modes is selected and set and a rendering process is executed in accordance with the rendering mode thus set.
In an image processing apparatus such as a computer game machine, an image is prescribed by a combination of a plurality of polygonal regions, and the entire object is rendered on a polygon-by-polygon basis. Such image processing apparatuses can display, on a monitor, polygons in a virtual space while changing their states in various manners.
FIG. 1 is a block diagram showing an example configuration of an image processing apparatus of the above kind. In this example, the CPU 11 performs various operations such as a coordinate conversion, a light source calculation, and a vector operation as well as controls the respective sections. A main bus 12 that transfers data at relatively high speed and a sub-bus 13 that transfers data at relatively low speed are connected to the CPU 11. The CPU 11 can exchange data via the buses 12 and 13. A CD-ROM drive 14, which is connected to the sub-bus 13, can read out any of various data or programs from a CD-ROM as a recording medium that is mounted therein in accordance with an instruction that is sent from the CPU 11.
A main memory 15 and a GPU (graphic processing unit) 16 are connected to the main bus 12. The main memory 15 stores data that has been read out from the CD-ROM drive 14, data as an operation result of the CPU 11, and other data. The GPU 16 performs a rendering operation while reading out data from the main memory 15 when necessary, and stores processed image data in a VRAM (video random access memory) 17. Further, the GPU 16 reads out image data from the VRAM 17 and supplies it to a D/A converter 18. The D/A converter 18 converts image data (digital signal) that is supplied from the GPU 16 into an analog signal and outputs it to a monitor (not shown) as a video signal.
In rendering a blended image by superimposing two images on each other in the above image processing apparatus, if one of the two images has a translucent region, such as a polygonal region, pixel data of the two images are blended together by using alpha data that are added to the image data (color data) of the respective subject images. The alpha data is a coefficient that takes a value in a range of 0.0 to 1.0. A value 1.0 is added to an opaque polygon, and a value 0.0 is added to a transparent polygon. A value in a range of 0.0 to 1.0 is added to a translucent polygon (the degree of transparency increases as the value becomes closer to 0.0, and decreases as the value becomes closer to 1.0).
For example, when a translucent image G is superimposed on an opaque image F, blended pixel data Cb is given by
Cb=As·Cs+(1−As)Cd  (1)
where a notation is employed that Cd and Ad are pixel data and alpha data of the image F and Cs and As are pixel data and alpha data of the image G.
This type of process is called alpha blending.
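The blending of Equation (1) can be sketched as follows; the function name and the sample pixel values are illustrative and are not part of the apparatus described above.

```python
# Sketch of the alpha blending of Equation (1); names and values are illustrative.
def alpha_blend(cs, a_s, cd):
    """Blend source pixel data cs (alpha a_s) over destination pixel data cd."""
    return a_s * cs + (1.0 - a_s) * cd

# A translucent pixel (As = 0.5) superimposed on an opaque pixel:
cb = alpha_blend(200.0, 0.5, 100.0)  # 0.5*200 + 0.5*100 = 150.0
```

With As = 1.0 the source pixel fully replaces the destination, and with As = 0.0 the destination is left unchanged, matching the opaque and transparent cases described above.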
FIG. 2 is a block diagram showing a more detailed example configuration of the GPU 16 and the VRAM 17 as circuits for performing alpha blending. In this example, the GPU 16 is composed of an interpolation circuit 21 and an alpha blending circuit 22 and the VRAM 17 has a Z buffer 31 and a frame buffer 32.
The interpolation circuit 21 performs interpolation on a polygon that has been read out from the main memory 15 (see FIG. 1), and supplies pixel data Cs of the interpolated polygon to the alpha blending circuit 22 and supplies depth data Zs and alpha data As of the interpolated polygon to the VRAM 17. The alpha blending circuit 22 generates pixel data Cb by blending pixel data Cd of a polygon that is stored in the VRAM 17 and the pixel data Cs that is supplied from the interpolation circuit 21 by using the alpha data that is supplied from the interpolation circuit 21. The alpha blending circuit 22 outputs the generated image data Cb to the VRAM 17.
The Z buffer 31 stores, of the depth data Zs supplied from the interpolation circuit 21 of the GPU 16, the depth data having the larger value (indicating that the image is located closer to the viewer's side). The frame buffer 32 stores image data of an image to be displayed on the monitor. The VRAM 17 stores alpha data As that is supplied from the interpolation circuit 21 in a predetermined area.
FIG. 3 illustrates an example process of rendering a blended image by superimposing two images on each other by using the alpha blending circuit shown in FIG. 2. In this example, it is assumed that pixel data Cd and alpha data Ad of an image 110 are stored in the VRAM 17 in advance and that an image 120 is to be superimposed on the image 110. In FIG. 3, pixel data of polygons that define each image are shown on the left side and corresponding alpha data values are shown on the right side.
In this example, a polygonal region 112 of the image 110 is opaque and the value of the corresponding alpha data Ad is 1.0. A region 122 of the image 120 is translucent and the value of its alpha data As is 0.5. A region 111 of the image 110 and a region 121 of the image 120 are transparent, and the values of their alpha data Ad and As are 0.0.
The alpha blending circuit 22 generates pixel data Cb by blending the pixel data Cd and Cs of the images 110 and 120 by using the alpha data As of the image 120 (refer to Equation (1)). The generated pixel data Cb and the alpha data As are rendered in (written to) the VRAM 17 as an image 130. The alpha data of the rendered image is newly denoted by Ad. The value of the pixel data Cb of the image 130 is the same as that of the pixel data Cd of the image 110 in regions 131 and 132, and is equal to (0.5 Cs+0.5 Cd) in a region 133.
In this case, although blending is correctly done for the pixel data Cb, the alpha data Ab is rendered as being the same as the alpha data As of the image 120. Since the opaque region 112 and the translucent region 122 have been superimposed on each other, the value of the alpha data Ab corresponding to the regions 132 and 133 should be 1.0 indicating that those regions are opaque. Therefore, in this case, the rendering operation has not been performed correctly.
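The defect described above can be made concrete with a small sketch (illustrative names; the values are those of FIG. 3):

```python
def blend_with_naive_alpha(cs, a_s, cd, a_d):
    """Blend pixel data per Equation (1); the stored alpha is simply
    overwritten by the source alpha As, as in the circuit of FIG. 2."""
    cb = a_s * cs + (1.0 - a_s) * cd
    ab = a_s  # defect: opaque (Ad = 1.0) under translucent (As = 0.5) should stay opaque
    return cb, ab

cb, ab = blend_with_naive_alpha(cs=200.0, a_s=0.5, cd=100.0, a_d=1.0)
# cb is blended correctly (150.0), but ab becomes 0.5 instead of the correct 1.0
```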
Incidentally, a method called an alpha test may be used in an image drawing process of the above kind. FIG. 4 is a block diagram showing an example configuration of the GPU 16 and the VRAM 17 in which a circuit for performing an alpha test is provided. The components in FIG. 4 that have corresponding components in FIG. 2 are given the same reference numerals as in FIG. 2 and descriptions therefor will be omitted where appropriate. In this example, an alpha test circuit 23 is provided between the interpolation circuit 21 and the VRAM 17. A predetermined constant C is set in the alpha test circuit 23. The alpha test circuit 23 compares the value of alpha data As that is supplied from the interpolation circuit 21 with the value of the constant C, and judges whether a comparison result satisfies a predetermined condition. In accordance with a judgment result, the alpha test circuit 23 makes a selection as to whether to render blended pixel data Cb that has been generated by the alpha blending circuit 22 and the alpha data As in the VRAM 17.
FIG. 5 illustrates an example process of rendering a blended image in which the value of the constant C of the alpha test circuit 23 shown in FIG. 4 is set at 1.0 and the condition is set to "EQUAL." The parts in FIG. 5 that have corresponding parts in FIG. 3 are given the same reference symbols as in FIG. 3 and descriptions therefor will be omitted where appropriate. In this example, the alpha test circuit 23 compares the values of the alpha data As of the image 120 with the value of the constant C, and judges whether they are equal to each other. In this case, since the alpha data As corresponding to the regions 121 and 122 of the image 120 have values 0.0 and 0.5, there is no alpha data As whose value is equal to the value 1.0 of the constant C. Therefore, the comparison results do not satisfy the condition EQUAL (1.0), and the alpha test circuit 23 gives a write disable instruction WD to the VRAM 17 so that the blended pixel data Cb that has been generated by the alpha blending circuit 22 and the alpha data As are not rendered in the VRAM 17. As a result, a blended image 140 is the same as the image 110. That is, in this case, the blended image rendering operation has not been performed correctly.
FIG. 6 illustrates an example process of rendering a blended image in which the value of the constant C of the alpha test circuit 23 shown in FIG. 4 is set at 0.0 and the condition is set to "NOT-EQUAL." The parts in FIG. 6 that have corresponding parts in FIG. 3 are given the same reference symbols as in FIG. 3 and descriptions therefor will be omitted where appropriate. The alpha test circuit 23 compares the values of alpha data As of an image 120 with the value of the constant C. Since the value of the alpha data As corresponding to a region 122 satisfies the condition NOT-EQUAL, the alpha test circuit 23 gives a write enable instruction WE to the VRAM 17, whereby blended pixel data Cb that has been generated by the alpha blending circuit 22 and the value 0.5 of the alpha data As are rendered in the VRAM 17 as an image 150.
Since regions 152 and 153 of the rendered image 150 have been generated by superimposing an opaque region 112 of an image 110 and the translucent region 122 of the image 120, alpha data Ab corresponding to the regions 152 and 153 should have a value 1.0 indicating that those regions are opaque. However, in this case, a value 0.5 indicating that the region 153 is translucent is rendered. Therefore, the rendering operation has not been performed correctly.
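As a sketch (illustrative function and condition names), the alpha test of FIGS. 5 and 6 reduces to a comparison that gates the write to the VRAM 17:

```python
def alpha_test_passes(a_s, c, condition):
    """Return True (write enable WE) when the source alpha As compared with
    the constant C satisfies the configured condition, as in FIG. 4."""
    if condition == "EQUAL":
        return a_s == c
    if condition == "NOT-EQUAL":
        return a_s != c
    raise ValueError("unknown condition: " + condition)

# FIG. 5: C = 1.0, EQUAL -> neither As = 0.0 nor As = 0.5 is written
# FIG. 6: C = 0.0, NOT-EQUAL -> As = 0.5 passes, As = 0.0 does not
```

In both settings, the test only decides whether the pair (Cb, As) is written; it cannot produce the correct combined alpha 1.0, which is the defect described above.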
Next, with reference to FIG. 7, a description will be made of an example of an image rendering process in which image depth is taken into consideration. This is a process called a Z buffer method in which values of depth data are compared with each other on a pixel-by-pixel basis based on depth data that is added to the respective pixels and, when a plurality of pixels are to be superimposed one on another, only the pixel that is closest to the viewer's side is rendered and the other pixels behind that pixel are not rendered. In FIG. 7, pixel data of each image are shown on the left side, corresponding alpha data values are shown at the center, and a depth data value is shown on the right side. Pixels exist closer to the viewer's side in the virtual space as the depth data value increases, and exist at a deeper position as it becomes closer to 0.
In this example, alpha data Ad of regions 161 and 162 of an image 160 have respective values 1.0 (opaque) and 0.5 (translucent), and alpha data As of regions 171 and 172 of an image 170 have a value 1.0 (opaque). Depth data Zd and Zs of the images 160 and 170 have respective values 100.0 and 50.0, which means that the image 160 is closer to the viewer's side than the image 170.
First, the respective data of the image 160 are rendered in the VRAM 17. Then, the pixel data of the image 170 is to be blended with that of the image 160. However, a comparison between the value of the depth data Zd of the image 160 that is stored in the Z buffer 31 and the value of the depth data Zs of the image 170 shows that Zs is smaller than Zd, that is, that the image 170 is deeper than the image 160. As a result, the respective data of the image 170 are prohibited from being rendered in the VRAM 17. Therefore, in this case, an image 180 obtained as the blended result remains the image 160, although correctly the image 170 should be seen through the translucent region 162 of the image 160. That is, the image rendering operation has not been performed correctly.
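The Z buffer comparison above can be sketched as follows (illustrative function name; larger depth values are closer to the viewer, per the convention of FIG. 7):

```python
def z_test_write(zs, zd_stored):
    """Z buffer test: allow the write only when the new pixel (depth zs)
    is closer to the viewer than the stored pixel (depth zd_stored)."""
    return zs > zd_stored

# FIG. 7: image 160 (Zd = 100.0) is rendered first; image 170 (Zs = 50.0)
# then fails the test everywhere, even behind the translucent region 162.
```

Because the test is applied uniformly per pixel, it has no way of admitting a deeper pixel that should remain visible through a translucent region, which is the defect described above.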
Next, with reference to FIG. 8, a description will be made of an example process in which an opaque polygonal region of an image is rendered as a transparent object. In this example, it is assumed that an image 400 is stored in the VRAM 17 in advance. Both the alpha data Ad and the depth data Zd of the image 400 have a value 0.0. Depth data Zs corresponding to pixel data Cs of regions 311 and 312 of an image 310-A have respective values 100.0 and 0.0, and alpha data As of the regions 311 and 312 have respective values 1.0 (opaque) and 0.0 (transparent). An assumption is made that it is desired that the opaque region 311 be used as a transparent object. Alpha data As and depth data Zs of an image 320 have values 1.0 and 50.0, respectively.
In this case, it is assumed that neither an alpha blending process nor an alpha test process is executed. First, the pixel data Cs, the alpha data As, and the depth data Zs of the image 310-A are rendered in the VRAM 17, whereby an image 310-B is obtained. Then, the pixel data Cs, the alpha data As, and the depth data Zs of the image 320 are rendered on those of the already rendered image 310-B. As an exception, since the value 100.0 of the depth data Zd corresponding to the region 311 of the image 310-B that is stored in the Z buffer 31 is larger than the value 50.0 of the depth data Zs of the image 320, rendering of the image 320 is prohibited in the region 311. Alpha data Ab of a region 501 of the thus-blended image 500 has a value 1.0 (opaque) and hence the region 501 is not expressed as a transparent object. In order to make the region 501 transparent, it is necessary to make the alpha data corresponding to the region 311 of the image 310-A have a value 0.0, that is, to make the region 311 transparent.
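The FIG. 8 sequence can be traced for one pixel of the region 311 with a small sketch (illustrative names; no alpha blending or alpha test, only the depth comparison, with larger depth values closer to the viewer):

```python
def write_pixel(buf, zbuf, i, c, a, z):
    """Write pixel data c and alpha a only when depth z is not smaller
    than the stored depth; update the Z buffer on a successful write."""
    if z >= zbuf[i]:
        buf[i] = (c, a)
        zbuf[i] = z

buf, zbuf = [(0.0, 0.0)], [0.0]               # image 400: Ad = 0.0, Zd = 0.0
write_pixel(buf, zbuf, 0, 100.0, 1.0, 100.0)  # region 311 of image 310-A
write_pixel(buf, zbuf, 0, 200.0, 1.0, 50.0)   # image 320: rejected (50.0 < 100.0)
# buf[0] keeps alpha 1.0, so region 501 remains opaque rather than transparent
```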
Conventional image processing apparatuses, in which processes of rendering an image including a translucent portion are executed in the above manners, have a problem that pixel data and alpha data of a blended image are not correlated with each other correctly.
Where a rendering process is executed according to the Z buffer method in such a manner that two images, one of which is closer to the viewer's side than the other, are superimposed on each other, there may occur, depending on the order of rendering the images in the VRAM 17, a case in which an image including a translucent portion cannot be rendered.
When it is desired to use an opaque polygon as a transparent polygon, the value of corresponding alpha data needs to be converted into a value 0.0 indicating that the polygon is transparent. This causes a problem that the same data cannot be used successively.
The present invention has been made in view of the above circumstances, and an object of the present invention is therefore to make it possible to perform an image rendering operation correctly by using an alpha test.
According to a first aspect of the invention, there is provided an image processing apparatus comprising means for selecting one of a plurality of image rendering modes; means for judging whether additional information of pixel data of a region that defines an image satisfies a predetermined condition; means for rendering the pixel data and the additional information if the judging means judges that the additional information satisfies the condition; and means for executing a rendering process in accordance with the selected rendering mode if the judging means judges that the additional information does not satisfy the condition.
According to a second aspect of the invention, there is provided an image processing method comprising the steps of selecting one of a plurality of image rendering modes; judging whether additional information of pixel data of a region that defines an image satisfies a predetermined condition; rendering the pixel data and the additional information if the judging step judges that the additional information satisfies the condition; and executing a rendering process in accordance with the selected rendering mode if the judging step judges that the additional information does not satisfy the condition.
According to a third aspect of the invention, there is provided a transmission medium for transmission of a program comprising the steps of selecting one of a plurality of image rendering modes; judging whether additional information of pixel data of a region that defines an image satisfies a predetermined condition; rendering the pixel data and the additional information if the judging step judges that the additional information satisfies the condition; and executing a rendering process in accordance with the selected rendering mode if the judging step judges that the additional information does not satisfy the condition.
In the above image processing apparatus and method and the transmission medium according to the respective aspects of the invention, one of a plurality of image rendering modes is selected, the pixel data and the additional information are rendered if it is judged that the additional information satisfies the condition, and a rendering process is executed in accordance with the selected rendering mode if it is judged that the additional information does not satisfy the condition.
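The claimed flow can be sketched as follows; this is a non-authoritative illustration, and the function names, the condition, and the example rendering mode are assumptions for exposition, not the claimed implementation.

```python
def render(pixel, additional, satisfies_condition, mode_process, buffer, index):
    """Write the pixel data and its additional information when the condition
    is satisfied; otherwise execute the process of the selected rendering mode."""
    if satisfies_condition(additional):
        buffer[index] = (pixel, additional)
    else:
        mode_process(buffer, index, pixel, additional)

# Hypothetical example: the condition is "alpha equals 1.0" (opaque), and the
# selected mode blends per Equation (1) while preserving the stored alpha.
def blend_mode(buffer, index, pixel, alpha):
    cd, ad = buffer[index]
    buffer[index] = (alpha * pixel + (1.0 - alpha) * cd, ad)  # keep stored alpha

buf = [(100.0, 1.0)]
render(200.0, 0.5, lambda a: a == 1.0, blend_mode, buf, 0)
# buf[0] becomes (150.0, 1.0): the pixel is blended, the stored alpha is preserved
```

Under this sketch, an opaque source pixel is written directly, while a translucent one falls through to the mode-specific process, so the stored alpha can remain correctly correlated with the blended pixel data.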