1. Field of the Invention
This invention relates to texture mapping devices which are employed in computer graphics systems that visually display objects in a three-dimensional manner as collections (or aggregations) of polygons. Herein, the texture mapping device is designed to perform texture mapping so that textures representing graphics patterns (such as wood-grain patterns) are put onto interior areas of the polygons which form a surface of the object displayed on a screen of a display device.
2. Prior Art
In computer graphics, an important problem is how to obtain realistic images (or pictures). To solve this problem, it is necessary to provide a technology for displaying an object in a three-dimensional manner. According to this technology, the object should be displayed in such a way that people feel as if the object really exists in a three-dimensional space. In addition, the surface of the object should be displayed realistically. In general, the surface of an actual object which exists in the natural world has colors and brightness. In addition, the actual object has patterns and a material feel, so that its surface is perceived as rough or smooth. Further, a person perceives that the visual appearance of the surface of an actual object changes in response to conditions such as its position and attitude. Such changes in the visual states of the surface of the object constitute the reality that the actual object possesses. In short, the reality of the displayed object depends on the display technique, namely, how to display the surface of the object realistically.
The texture mapping technology is introduced to obtain displayed images that are capable of realistically showing the visual states of three-dimensional objects. Simply speaking, the texture mapping technology performs image processing in which image patterns (or graphics patterns) called textures are put onto polygons which form the surface of the object displayed in a three-dimensional manner on a screen of a display device.
Next, a summary of the texture mapping technology will be briefly described below.
FIG. 4 shows that a triangle ABC representing an example of the polygon is placed in a three-dimensional space defined by three axes x, y and z. This triangle ABC shows an example of a polygon which forms a surface of a display object. In FIG. 4, VP designates a viewpoint. When taking a view from the viewpoint VP in the direction of the axis z, it is possible to obtain a projective image of the polygon on an x-y plane defined by the axes x and y. The projective image is displayed on a screen of the display device. In FIG. 4, a triangle consisting of three vertices SA, SB and SC corresponds to the projective image which is projected onto the x-y plane from the triangle ABC. That is, the triangle SA-SB-SC is a projective image of the triangle ABC which should be displayed on the screen of the display device, wherein the vertices SA, SB, SC respectively correspond to the vertices A, B, C of the triangle. In order to perform texture mapping on the surface of the display object, an image of the texture should be mapped into an interior area of the triangle SA-SB-SC within the screen of the display device.
To perform the texture mapping described above, texture data representing images of textures are stored in advance in a memory called a texture memory. FIG. 5A shows a two-dimensional coordinate system (hereinafter referred to as a texture coordinate system) defined by coordinate axes tx and ty, wherein a triangle consisting of vertices TA, TB and TC is shown. This triangle TA-TB-TC is a right front view representing a part of the surface of the display object. The aforementioned texture data define colors and brightness (or luminance) with respect to segments of the image of the texture within the texture coordinate system. In order to obtain texture data with respect to a desired segment of the texture, it is necessary to access the texture memory in response to a read address, so that texture data of the desired segment of the texture are read from the texture memory. Herein, the read address is constituted by coordinates which are read from the texture coordinate system with regard to the desired segment. Incidentally, these coordinates will be referred to as a texture address (i.e., the aforementioned read address).
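Purely for illustration, the texture memory described above can be modeled as a two-dimensional array of texels indexed by the texture address (tx, ty). The array contents and names below are hypothetical, not taken from the specification; a minimal sketch:

```python
# A hypothetical texture memory: a 2-D array of texel colors (R, G, B),
# indexed by the texture address (tx, ty). Real hardware would use dedicated
# memory; a nested list suffices for illustration.
texture_memory = [
    [(255, 0, 0), (0, 255, 0)],     # row ty = 0
    [(0, 0, 255), (255, 255, 255)]  # row ty = 1
]

def read_texel(tx, ty):
    """Read texture data from the texture memory at texture address (tx, ty)."""
    return texture_memory[ty][tx]
```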
The display device displays images on its screen on the basis of image data stored in a display memory. To put the texture onto an interior area of the triangle SA-SB-SC within the screen of the display device, texture data corresponding to the texture should be written into the display memory as image data which represent the image placed inside of the triangle SA-SB-SC. To achieve the above, the following operations are performed.
FIG. 5B shows an x-y coordinate system (referred to as a screen coordinate system) wherein a triangle consisting of vertices SA, SB, SC is shown. The vertices SA, SB, SC shown in FIG. 5B respectively correspond to the vertices TA, TB, TC shown in FIG. 5A. So, texture data corresponding to the vertices TA, TB, TC are read from the texture memory and are then written at addresses of the display memory which correspond to the vertices SA, SB, SC. Thus, it is possible to perform texture mapping, at least apparently, with respect to the vertices SA, SB, SC by use of the texture data of the vertices TA, TB, TC alone.
However, the relationships between points of the screen other than the vertices SA, SB, SC and points of the texture coordinate system of FIG. 5A remain undefined. In other words, it is uncertain which addresses should be used to read texture data from the texture memory for those points of the screen. To resolve this uncertainty, the system of the conventional technology employs a mapping method using interpolation calculations. Herein, the steps of the mapping method are mainly classified into three processes (a), (b) and (c), as follows:
(a) The screen of the display device is constructed by a number of horizontal scanning lines. Herein, a certain set of horizontal scanning lines which traverse the triangle consisting of the vertices SA, SB, SC are selected from among the horizontal scanning lines constructing the screen of the display device. Points of intersection are formed between the sides of the triangle and the horizontal scanning lines traversing the triangle. So, the system detects x-y coordinates from the screen coordinate system with respect to each point of intersection. Incidentally, these x-y coordinates will be referred to as a screen address. FIG. 5B shows line segments h1 to h6, which are the parts of the horizontal scanning lines placed inside of the triangle SA-SB-SC on the screen. Points of intersection emerge where the ends of each line segment meet the sides of the triangle, and addresses are detected with respect to those points of intersection. For convenience, a point of intersection which is formed between the left end of a line segment and a side of the triangle is called a start point, whilst a point of intersection which is formed between the right end of a line segment and a side of the triangle is called an end point.
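Process (a) can be sketched as follows. This is a simplified illustration, not the specification's own implementation; the function and variable names are my own. It finds the start point and end point of one scanning line that traverses the triangle SA-SB-SC:

```python
def scanline_intersections(verts, y):
    """Find the x-coordinates where the horizontal scanning line at height y
    crosses the sides of a triangle (a sketch of process (a)).

    verts: three (x, y) vertex tuples, e.g. the screen addresses of SA, SB, SC.
    Returns (x_start, x_end) for the start point and end point,
    or None if the scanning line does not traverse the triangle.
    """
    xs = []
    for i in range(3):
        (x0, y0), (x1, y1) = verts[i], verts[(i + 1) % 3]
        if y0 == y1:
            continue  # horizontal side: no single crossing point
        # does the scanning line cross this side?
        if min(y0, y1) <= y < max(y0, y1):
            t = (y - y0) / (y1 - y0)
            xs.append(x0 + t * (x1 - x0))
    if len(xs) < 2:
        return None
    return min(xs), max(xs)  # left end (start point), right end (end point)
```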
(b) Next, the system performs interpolation calculations to calculate a texture address with respect to each of the points of the texture coordinate system (see FIG. 5A) which respectively correspond to the start point and end point of each line segment. Line segments th1 to th6 shown in FIG. 5A are equivalent to the aforementioned line segments h1 to h6 of FIG. 5B projected onto the texture coordinate system. So, the system calculates a texture address representing each point of intersection which is formed between the line segments th1 to th6 and the sides of the triangle consisting of vertices TA, TB, TC.
In FIG. 5B, for example, a start point p1 of the line segment h4 has a screen address (x1, y1); the vertex SA has a screen address (xa, ya); and the vertex SB has a screen address (xb, yb). In FIG. 5A, the vertex TA has a texture address (txa, tya); and the vertex TB has a texture address (txb, tyb). The start point p1 of the screen coordinate system of FIG. 5B corresponds to a point q1 of the texture coordinate system of FIG. 5A. Herein, the point q1 has a texture address (tx1, ty1) which is calculated by interpolation calculations, as follows:

tx1 = txb + {(x1 - xb)/(xa - xb)} · (txa - txb)   (1)

ty1 = tyb + {(y1 - yb)/(ya - yb)} · (tya - tyb)   (2)
Similar calculations are performed with respect to the start points and end points of the other line segments. That is, interpolation calculations are performed using the screen addresses of the start points and end points of the line segments h1 to h6 and the screen addresses of the vertices SA, SB, SC on the screen coordinate system, as well as the texture addresses of the vertices TA, TB, TC on the texture coordinate system. Thus, it is possible to calculate a texture address with respect to each of the start points and end points.
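Equations (1) and (2) can be written directly as code. The following sketch (the function name is illustrative, not from the specification) computes the texture address (tx1, ty1) of a start point or end point from its screen address, the screen addresses of vertices SA and SB, and the texture addresses of vertices TA and TB:

```python
def texture_address(p, sa, sb, ta, tb):
    """Interpolate the texture address of a point p lying on the side SA-SB.

    p:      screen address (x1, y1) of the start point or end point
    sa, sb: screen addresses (xa, ya), (xb, yb) of vertices SA, SB
    ta, tb: texture addresses (txa, tya), (txb, tyb) of vertices TA, TB
    Implements equations (1) and (2) of the text.
    """
    (x1, y1), (xa, ya), (xb, yb) = p, sa, sb
    (txa, tya), (txb, tyb) = ta, tb
    tx1 = txb + ((x1 - xb) / (xa - xb)) * (txa - txb)  # equation (1)
    ty1 = tyb + ((y1 - yb) / (ya - yb)) * (tya - tyb)  # equation (2)
    return tx1, ty1
```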
(c) On the basis of the texture addresses which are produced by the aforementioned process (b), texture data representing images of the line segments th1 to th6 are read from the texture memory. Then, the read texture data are written into the display memory as image data used for displaying the line segments h1 to h6 on the screen coordinate system.
As a result, texture data representing the interior area of the triangle TA-TB-TC of the texture coordinate system are written into the display memory as image data representing the interior area of the triangle SA-SB-SC.
The aforementioned processes are performed with respect to each portion of the display object which is displayed on the screen of the display device. Thus, it is possible to obtain a realistic three-dimensional image whose surface is decorated with textures.
In the conventional technology described heretofore, the texture data representing the line segments th1 to th6 of the texture coordinate system should be displayed so as to accurately match the locations of the line segments h1 to h6 on the screen coordinate system. In general, however, a finite number of pixels are arranged at equal pitches on the screen of the display device, so the texture data of the line segments th1 to th6 must be displayed using those pixels. For this reason, when the texture data of the line segments th1 to th6 are written into the display memory, a rounding process such as a round-off process is effected on the screen addresses. Due to the rounding process, the values of the screen addresses are converted to integers, and those integers are used as write addresses for the display memory. Due to the rounding errors of the rounding process, the actual display locations of the line segments th1 to th6 of the texture coordinate system deviate from the desired display locations at which the line segments should be displayed on the screen. In other words, a positional gap occurs between the actual display location and the desired display location. If such a positional gap occurs in the texture mapping, notches may emerge particularly on straight-line portions of the images displayed on the screen of the display device. This is a problem in that clear and fine images cannot be obtained.
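The rounding step and the resulting positional gap can be illustrated with a small numeric example (the coordinate values below are hypothetical, chosen only for illustration):

```python
# Hypothetical fractional x-coordinates of a start point and an end point,
# as produced by the interpolation calculations.
x_start, x_end = 2.37, 8.62

# The rounding (round-off) process converts them to the integer pixel
# addresses used as write addresses for the display memory.
px_start, px_end = round(x_start), round(x_end)  # 2 and 9

# The positional gap (rounding error) between the desired display location
# and the actual display location is up to half a pixel pitch per point.
gap_start = px_start - x_start  # about -0.37 pixel
gap_end = px_end - x_end        # about +0.38 pixel
```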
Next, an in-depth description will be given with respect to the above problem by using a concrete example. FIG. 6A shows an enlarged image of the line segment h4 shown in FIG. 5B, wherein circles (○) disposed on the line segment h4 indicate the locations of pixels. Herein, the x-coordinate values of the addresses representing a start point p1 and an end point p2 of the line segment h4 are calculated by the aforementioned interpolation calculations. These x-coordinate values generally deviate from the x-coordinate values of the pixels, which are integers.
Now, suppose that texture data should be obtained with respect to a certain portion of the line segment h4. In order to do so, it is necessary to calculate a texture address with respect to that portion of the line segment h4, as follows:
FIG. 6B shows an x-tx coordinate system in which a straight line L is drawn to pass through a point (x1, tx1) and a point (x2, tx2). Herein, the values tx1 and tx2 of the above points are the tx-coordinate values of the points q1 and q2 of the texture coordinate system (see FIG. 5A) which respectively correspond to the start point p1 and end point p2. They are calculated by the aforementioned equations of the interpolation calculations. Using an interpolation calculation along the straight line L, it is possible to calculate a tx-coordinate value for the texture address of the point of the line segment th4 which corresponds to an arbitrarily selected point of the line segment h4. So, the calculation is started when a specific point of the line segment h4 is given. Herein, a projection is made between the line segment h4 and the straight line L with respect to the specific point. Thus, it is possible to read a tx-coordinate value from the straight line L in response to the x-coordinate value of the specific point. Using such a principle, it is possible to obtain a texture address with respect to any specific point which is selected from among the points constructing the line segment h4. Incidentally, the above principle is explained using FIGS. 6A and 6B only with respect to the method of obtaining the tx-coordinate value of the texture address. However, the ty-coordinate value of the texture address can be obtained in a similar way as well.
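The projection onto the straight line L amounts to a linear interpolation of tx in terms of x. A minimal sketch (the function name is my own, not from the specification):

```python
def tx_on_line_L(x, x1, tx1, x2, tx2):
    """tx-coordinate on the straight line L passing through (x1, tx1) and
    (x2, tx2), evaluated at an arbitrary x-coordinate on line segment h4.
    The ty-coordinate is obtained the same way from the y-coordinates."""
    return tx1 + ((x - x1) / (x2 - x1)) * (tx2 - tx1)
```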
Using the above interpolation calculations, it is possible to obtain a texture address (tx, ty) with respect to any necessary point which is selected from among the points constructing the line segment th4 on the texture coordinate system. Thus, texture data are sequentially read from the texture memory in accordance with the texture addresses which are sequentially calculated as described above. Then, the texture data are written into the display memory as image data which correspond to the points constructing the line segment h4 on the x-y coordinate system.
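Putting the pieces together, the per-segment read/write loop might look as follows. This is a sketch under the simplifying assumption that the texture memory and the display memory are plain 2-D arrays; the names are illustrative. Note the rounding of addresses, which is the source of the positional deviation the conventional technology suffers from:

```python
def map_segment(y, x_p1, x_p2, t_p1, t_p2, texture, display):
    """Copy texture data onto the pixels of one line segment (e.g. h4).

    y:          integer scanning-line number of the segment
    x_p1, x_p2: fractional x-coordinates of the start point and end point
    t_p1, t_p2: texture addresses (tx, ty) of the corresponding points q1, q2
    texture, display: 2-D arrays indexed as array[row][column]
    """
    (tx1, ty1), (tx2, ty2) = t_p1, t_p2
    for px in range(round(x_p1), round(x_p2) + 1):
        # interpolate the texture address for this pixel along line L
        t = (px - x_p1) / (x_p2 - x_p1)
        tx = round(tx1 + t * (tx2 - tx1))
        ty = round(ty1 + t * (ty2 - ty1))
        # the rounded write/read addresses introduce the positional deviation
        display[y][px] = texture[ty][tx]
```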
The writing of the image data into the display memory is performed using "integer" addresses, i.e., the integers obtained by rounding the screen addresses corresponding to the points constructing the line segment h4. As a result, each piece of texture data is assigned to a pixel which may deviate from the desired display location at which that texture data should originally be displayed. Such a deviation in display will be described below in conjunction with FIGS. 6A and 6B.
Originally, the texture data corresponding to the texture address (tx1, ty1) should be written into the display memory as image data corresponding to the start point p1. However, the conventional technology uses only an integer for the screen address of the start point p1. So, the above texture data corresponding to the start point p1 are written into the display memory as image data corresponding to a pixel P1 which is located in proximity to the start point p1 (see FIG. 6A). Such a deviation occurs at the end point p2 as well. That is, the texture data which should be assigned to the end point p2 are assigned to a pixel P2 which is located in proximity to the end point p2. Similarly, the other texture data corresponding to the other points of the line segment h4, which are located between the start point p1 and end point p2, are assigned to pixels deviating from the desired display locations at which they should originally be displayed.
Because the conventional technology employs the rounding process on addresses when effecting the texture mapping, texture data are mapped to wrong display locations which deviate from the desired display positions by the rounding errors. In short, the conventional technology suffers from a problem in that positional deviations occur in the display locations for displaying texture data on the screen of a display device due to the rounding of addresses.