1. Field of the Invention
The present invention relates to a vehicle system for spatialized audio playback based on the location of the vehicle. More particularly, the present invention relates to the presentation of sound in a vehicle such that a listener perceives one or more sounds as coming from specified three-dimensional spatial locations, where the presentation of the sound is determined based on the location of the vehicle.
2. Description of Related Art
Consumers continually demand increased access to information, especially while in their vehicles. The demand for a wider array of in-vehicle “infotainment” options for drivers has resulted in more sophisticated sound systems, e.g. seven- to eleven-speaker arrays and “intelligent” equalization features; increased information bandwidth, e.g. multi-channel XM® satellite radio and Onstar® cellular data links; and telematics systems, e.g. GPS-based navigation systems. Of course, automobile consumers are not the only ones interested in increased access to information. The military also has a need for navigation assistance within an immersive battlefield visualization; damage control assistance (locating problems); and object detection and tracking.
Current systems, such as the system disclosed in U.S. Pat. No. 5,767,795, enable information to be presented to an operator of a vehicle using either video or audio cues that are presented based on the location of the vehicle. However, the visual cues may distract the operator of the vehicle, causing the operator to shift his or her attention from the roadway and other vehicles to the visual cues and thereby increasing the risk to the safety of the driver. In addition, the audio cues are limited to those provided by a data storage means such as a CD-ROM. Further, the audio cues provided in vehicle systems today do not utilize the various audio components within the vehicle to provide additional information.
The related art also includes the following:
1. U.S. Pat. No. 5,491,754: Method and system for artificial spatialization of digital audio signals. This system lays the groundwork for synthetically spatializing audio using multisource signal delays. It does not address geo-coded audio or the use of such in a vehicle.
2. U.S. Pat. No. 5,521,981: Sound Positioner. A system for presenting binaural sound to a listener with the desired effect of the perception of the sound coming from specified three-dimensional spatial locations. The spatial positioning parameters are adjustable in real time but do not involve geo-coded locations of interest as used in a vehicle.
3. U.S. Pat. No. 5,757,929: Audio interface garment and communication system for use therewith. This system utilizes user-wearable arrays of microphones and speakers and digital transceivers. The system provides for spatialized audio output to nearby recipients and/or listening in to audio coming from selected directions and/or peers. It does not address geo-coded audio or the use of such in a vehicle.
4. U.S. Pat. No. 5,767,795: GPS-based information system for vehicles. This system enables information to be presented to a driver using either a video display or audio. It does not address the issue of spatialized playback based on location.
5. U.S. Pat. No. 5,642,285: Outdoor movie camera GPS-position and time code data-logging for special effects production. This system enables post-production use of position with video for special effects and animation. It does not address the problem of correct playback based on current location or three-dimensional audio capability.
6. U.S. Pat. No. 6,060,993: Mobile display system. This system enables a mobile display of a message to update based on position (e.g. for advertisement), but does not teach spatialized audio with location.
7. Azuma, R., Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre, “Recent Advances in Augmented Reality,” IEEE Computer Graphics and Applications, vol. 21, no. 6 (November/December 2001), pp. 34-47.
8. Feiner, S., B. MacIntyre, and T. Höllerer, “Wearing It Out: First Steps Toward Mobile Augmented Reality Systems,” in Y. Ohta and H. Tamura (eds.): Mixed Reality: Merging Real and Virtual Worlds, Ohmsha (Tokyo)/Springer Verlag, 1999, pp. 363-377, http://www.cs.ucsb.edu/˜holl/pubs/feiner-1999-ismr.pdf.
9. Scott-Young, S., “Seeing the Road Ahead,” GPS World, Nov. 1, 2003.
10. Kyriakakis, C., “Fundamental and Technological Limitations of Immersive Audio Systems,” Proceedings of the IEEE, vol. 86, pp. 941-951, 1998.