Direct-path 3D audio is used to render sound sources in three dimensions to a listener. In addition to simulating the sources themselves, a more realistic experience may be provided to the user by also simulating the interaction of the sources with objects in a virtual environment. Such objects may occlude certain sound sources from the listener and may also reflect them. For example, sounds originating in a virtual room would sound different to a listener depending on the size of the room, and sounds originating in an adjacent room would sound different depending on whether they were occluded by a wall or transmitted via a path through an open door. In an open environment, sounds reflected from other objects also affect the listener's perception. A familiar example is hearing the sounds of a car in which a passenger is traveling reflected off passing telephone poles. Rendering such sounds dynamically would greatly enhance an acoustic virtual reality experience for a listener.
FIG. 1 is a block diagram illustrating a virtual scene in which a listener is located in a virtual environment that includes sound sources. A listener 100 is located in a chamber that is defined by walls 102a, 102b, 102c, and 102d. A sound source 104a and a sound source 104b generate sounds that may be heard by the listener. An obstacle defined by walls 106a, 106b, 106c, and 106d is located between listener 100 and sound source 104b. Wall 106d includes a door 108a and wall 106b includes a door 108b. The doors may be either open or closed and may determine whether or not walls 106d and 106b block sound from source 104b from reaching listener 100. A solid object 110 is also located within the chamber.
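For purposes of illustration only, the geometric occlusion test implied by FIG. 1 may be sketched as follows. This is not the method of the present system; it is a minimal sketch in which a wall blocks sound when its segment crosses the direct source-listener path, and an open door is modeled simply by omitting the door's span from the blocking geometry. All coordinates and names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Point:
    x: float
    y: float

@dataclass(frozen=True)
class Segment:
    a: Point
    b: Point

def _cross(o, a, b):
    # z-component of the cross product (a - o) x (b - o)
    return (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x)

def segments_intersect(s, t):
    # Proper intersection test: the endpoints of each segment must lie
    # on opposite sides of the other segment's supporting line.
    d1 = _cross(t.a, t.b, s.a)
    d2 = _cross(t.a, t.b, s.b)
    d3 = _cross(s.a, s.b, t.a)
    d4 = _cross(s.a, s.b, t.b)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

def is_occluded(source, listener, blocking_walls):
    """True if any blocking wall segment crosses the direct path."""
    path = Segment(source, listener)
    return any(segments_intersect(path, w) for w in blocking_walls)

# Hypothetical layout: obstacle wall 106d with door 108a. When the door
# is open, the door's span is removed from the blocking geometry.
wall_106d_closed = [Segment(Point(4, 0), Point(4, 4))]
wall_106d_open = [Segment(Point(4, 0), Point(4, 1.5)),
                  Segment(Point(4, 2.5), Point(4, 4))]

listener_100 = Point(1, 2)
source_104b = Point(7, 2)

blocked = is_occluded(source_104b, listener_100, wall_106d_closed)  # True
clear = is_occluded(source_104b, listener_100, wall_106d_open)      # False
```

In this sketch, opening door 108a removes the segment between y = 1.5 and y = 2.5 from wall 106d, so the direct path at y = 2 passes through the gap and source 104b is no longer occluded from listener 100.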
The walls of the chamber may reflect the sounds generated by the sound sources, creating echoes and reverberations that give the listener a perception of the spaciousness of the room. In addition, objects in the room may also reflect sound from the sound sources or occlude the sound sources, preventing sound from reaching the listener or muffling the sound.
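One conventional way to model a single wall reflection of the kind described above is the image-source construction: the source is mirrored across the wall, and the reflection behaves as if it traveled in a straight line from that image to the listener. The following sketch of a first-order reflection is illustrative only and is not the method of the present system; the function names, coordinates, and the 1/r amplitude model are assumptions.

```python
import math

def mirror_across_line(p, a, b):
    """Reflect point p across the infinite line through points a and b."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    length2 = dx * dx + dy * dy
    # Parameter of the foot of the perpendicular from p onto the line.
    t = ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / length2
    foot = (a[0] + t * dx, a[1] + t * dy)
    return (2 * foot[0] - p[0], 2 * foot[1] - p[1])

def first_order_reflection(source, listener, wall_a, wall_b, c=343.0):
    """Delay (seconds) and 1/r amplitude for one wall reflection."""
    image = mirror_across_line(source, wall_a, wall_b)
    d = math.dist(image, listener)  # reflected path length
    return d / c, 1.0 / d

# Hypothetical scene: a wall along the x-axis, source at (2, 1),
# listener at (6, 1). The image source lies at (2, -1).
delay, gain = first_order_reflection((2, 1), (6, 1), (0, 0), (10, 0))
```

The reflected path length here is the distance from the image (2, -1) to the listener (6, 1), about 4.47 units, giving a delay of roughly 13 ms at c = 343 m/s. As a listener or source moves, the image moves with it, which is why such reflections must be recomputed dynamically.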
The listener's perception of such a virtual acoustic environment could be greatly enhanced if the relative positions of the listener, the sources, the objects, and the walls of the chamber could be dynamically modeled as the listener, the sources, and the objects move about the virtual chamber as a result of a simulation or game that is running on a computer. However, current systems do not provide for real-time dynamic acoustic rendering of such a virtual environment because of the processing demands of such a system.
Prior systems have either required off-line processing to precisely render a complex environment such as a concert hall, or have modeled acoustic environments in real time by relatively primitive means such as providing selectable preset reverberation filters that modify sounds from sources. The reverberations provide a perceived effect of room acoustics but do not simulate effects that a virtual listener would perceive when moving about an environment and changing his or her geometrical position with respect to walls and other objects in the virtual environment. Acoustic reflections have been modeled for simple environments such as a six-sided box, but no system has been designed for real-time dynamic acoustic rendering of a virtual environment including multiple sources and moving objects.
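The preset reverberation filters of such prior systems can be illustrated, in simplified form, by a feedback comb filter of the kind used in classic artificial reverberators. The sketch below is an assumption for illustration, not a description of any particular prior product; note that its output depends only on the fixed delay and feedback parameters, not on the listener's position in the scene.

```python
def comb_filter(samples, delay, feedback):
    """Feedback comb filter: y[n] = x[n] + feedback * y[n - delay]."""
    out = list(samples)
    for n in range(delay, len(out)):
        out[n] += feedback * out[n - delay]
    return out

# An impulse through the filter yields a decaying train of echoes that
# is independent of where the listener stands in the virtual room.
impulse = [1.0] + [0.0] * 9
echoes = comb_filter(impulse, delay=3, feedback=0.5)
# echoes → [1.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.25, 0.0, 0.0, 0.125]
```

A "preset" in this simplified model is just a stored (delay, feedback) pair; selecting a different preset changes the perceived room character, but no preset can track the listener's changing geometry as described above.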
A method is needed for efficiently rendering reflections and occlusions of sound by various objects within a virtual environment. Ideally, such a method would require a minimum of additional programming of the application simulating the virtual environment and would also minimize processing and memory requirements.