1. Field of the Invention
This invention relates generally to computer generated images and more particularly to a system and method for environment mapping.
2. Description of the Background Art
Typically, the illumination of a computer-generated object by discrete light sources, continuous light sources, and ambient light is described by an illumination model. The object is illuminated by the reflection of ambient light and the reflection of light source light from the surface of the object. Generally, the illumination model is a mathematical expression that operates on a set of variables to generate reflection properties, such as color and intensity of reflected light and an object's texture as viewed by an observer. Given ambient light and light sources positioned about the object, the illumination model defines the reflection properties of the object. The illumination model is considered to be accurate if the illuminated object appears realistic to an observer.
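The kind of mathematical expression described above can be illustrated with a minimal sketch of a Phong-style illumination model, a common choice in the art. The coefficients and dot products used here are illustrative assumptions only and are not taken from the text:

```python
def illuminate(ka, kd, ks, shininess, ambient, light_intensity, n_dot_l, r_dot_v):
    """Toy Phong-style illumination model (illustrative only).

    ka, kd, ks -- ambient, diffuse, and specular reflection coefficients
    n_dot_l    -- dot product of surface normal and light direction
    r_dot_v    -- dot product of reflection direction and view direction
    Returns a single reflected intensity for one light source plus ambient.
    """
    diffuse = kd * light_intensity * max(n_dot_l, 0.0)
    specular = ks * light_intensity * max(r_dot_v, 0.0) ** shininess
    return ka * ambient + diffuse + specular
```

Note that the specular term depends on the view direction (`r_dot_v`), so the computed reflection changes as the observer moves, which is the behavior the direct normal projection method discussed below fails to capture.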
Typically, the illumination model is incorporated in a software program executed by a vector processing unit, a central processing unit, or a rendering engine of a computer system. The program must be capable of computing the illumination of the object when the light sources change position with respect to the object, when the observer views the illuminated object from a different angle, or when the object is rotated. Furthermore, an efficient illumination model is needed for the processing unit to compute the illumination in real-time, for example, if the observer (i.e., a camera) is moving with respect to the object. Therefore, it is desired to incorporate terms in the illumination model that are computationally cost effective, while at the same time generating an image of the illuminated object that is aesthetically pleasing to the observer.
Computing texture (i.e., environment mapping) is important when rendering a realistic image of the illuminated object that closely resembles a real physical object. Typically, texture coordinates for each point of the object's surface are computed, and a texture map comprising the texture coordinates is generated.
FIG. 1 illustrates a prior art direct normal projection method for computing an object's texture coordinates. FIG. 1 includes an object's surface 105, a point P on surface 105, a normal vector n to surface 105 at point P, an observer 110a, a line-of-sight 115a between observer 110a and the point P, and a projection of the normal vector n onto an x-axis 120, referred to as nx. In general, a z-axis (not shown) is perpendicular to x-axis 120 and lies in the plane of FIG. 1, and a y-axis (not shown) is perpendicular to both x-axis 120 and the z-axis and points out of the plane of FIG. 1. For simplicity of illustration, the FIG. 1 embodiment of object's surface 105 is a line; however, surface 105 is typically any 2-D surface, and hence, in general, the normal vector n may have a vector component ny along the y-axis.
In operation, the direct normal projection method computes the projected components nx and ny of the normal vector n for each point P on object's surface 105. The central processing unit or vector processing unit then maps (i.e., transforms) the projected components nx and ny into texture coordinates (s,t) using one or more mapping algorithms known in the art. The vector processing unit then uses the computed texture coordinates (s,t) for each point P, as well as other reflection variables, in an illumination model to generate a reflection pattern of object's surface 105. Although the direct normal projection method of the prior art may be fast, the method generates a reflection pattern that appears “painted-on” as observer 110a moves to different locations. In other words, the reflection pattern of object's surface 105 does not change with respect to rotation or translation of observer 110a, since the method depends only upon the x and y components of the normal vector n, independent of the position of observer 110a with respect to the point P. For example, the vector processing unit computes the same projected components (nx,ny) and texture coordinates (s,t) for an observer 110b viewing point P as for observer 110a viewing point P.
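The direct normal projection just described can be sketched as follows. The linear remap of the normal components from [-1, 1] to [0, 1] is one common convention for producing texture coordinates and is assumed here for illustration; the text does not specify the exact mapping algorithm:

```python
import math

def direct_normal_projection(normal):
    """Sketch of the prior-art direct normal projection.

    Takes a surface normal (nx, ny, nz) at a point P, projects it onto
    the x- and y-axes, and remaps the components from [-1, 1] to [0, 1]
    to obtain texture coordinates (s, t). No observer position appears
    anywhere in the computation, which is why the resulting reflection
    pattern looks "painted on" as the observer moves.
    """
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny = nx / length, ny / length  # normalize so components lie in [-1, 1]
    s = 0.5 * nx + 0.5
    t = 0.5 * ny + 0.5
    return s, t
```

Because the function's inputs include only the normal at P, observers 110a and 110b necessarily receive identical texture coordinates for the same point P.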
It would be useful to implement a system and method of environment mapping that depends upon an observer's location with respect to an object's location and orientation to generate a more realistic reflection pattern, and that is consistent with results of the direct normal projection method for particular object-observer geometries.