1. Field of the Invention
This invention relates generally to content rendering, and more particularly to content rendering across a network.
2. Description of the Related Art
As the value and use of information continues to increase, individuals and businesses seek additional ways to process and store information. One option available to users is information handling systems. An information handling system generally processes, compiles, stores, and/or communicates information or data for business, personal, or other purposes, thereby allowing users to take advantage of the value of the information. Because technology and information handling needs and requirements vary between different users or applications, information handling systems may also vary regarding what information is handled, how the information is handled, how much information is processed, stored, or communicated, and how quickly and efficiently the information may be processed, stored, or communicated. These variations allow information handling systems to be general or configured for a specific user or specific use, such as financial transaction processing, airline reservations, enterprise data storage, or global communications. In addition, information handling systems may include a variety of hardware and software components that may be configured to process, store, and communicate information, and may include one or more computer systems, data storage systems, and networking systems.
As the digital home begins to unfold, one of the predominant architectures for media consumption and distribution includes an information handling system configured in the form of a media server, which is the repository for digital content (e.g., digital audio content stored as MPEG Layer 3 (“MP3”) or waveform (“WAV”) files), and media nodes, which request that the digital content be transferred from the media server over a home network and then rendered locally on the media nodes. Media servers may take the form of a personal computer (“PC”), a dedicated media server device, a dedicated storage device, or a more traditional consumer electronics device. A media node may be a dedicated box meant to bridge this new media architecture to a more traditional consumer electronics architecture, may be embedded within a piece of consumer electronics equipment, or may be a handheld or portable device within the network. One example of a media node is a digital audio receiver (“DAR”), which renders digital content and outputs it as analog signals. Using this technology and architecture, consumers can store, organize, and retain content on one or more storage devices, and access the content in the environs in which they wish to consume it. This allows content management and storage to be separate from content consumption.
A common scenario that occurs during the consumption and distribution of media over home network architectures is the request that a piece of content or collection of content be rendered simultaneously on multiple media nodes in multiple rooms within a home. Social gatherings, parties, and simply doing housework while listening to music are examples of situations where such synchronized rendering would be desirable. However, when content is rendered simultaneously on multiple media nodes, each rendering node may fetch its content independently and start playing at different times. In addition, each node has its own unique internal latency characteristics and the network infrastructure and server introduce additional unpredictable delays. This typically results in relatively large playback delays between nodes or the occurrence of other anomalies such as echoes, beating, or acoustic cancellation, thereby minimizing the enjoyment value that may be achieved using multiple rendering locations.
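The magnitude of this problem can be illustrated with a small numeric sketch. The per-node latency figures below are purely hypothetical, chosen only to show how independent content fetch, buffering, and decoder delays compound into audible inter-node skew:

```python
# Hypothetical per-node delays (seconds) between a "play" request and the
# first audio sample being emitted: network fetch + buffering + decoder latency.
# These figures are illustrative, not measurements from any actual device.
NODE_DELAYS = {
    "living_room": 0.120 + 0.250 + 0.040,
    "kitchen":     0.300 + 0.180 + 0.075,
    "bedroom":     0.090 + 0.400 + 0.020,
}

def playback_skews(delays):
    """Return each node's playback-start offset from the earliest node."""
    earliest = min(delays.values())
    return {node: round(d - earliest, 3) for node, d in delays.items()}

skews = playback_skews(NODE_DELAYS)
```

With these assumed figures the kitchen node starts 145 ms after the living room and the bedroom 100 ms after it; offsets on that order between rooms are readily perceived as echo, which is why uncoordinated playback of the same content across nodes degrades the listening experience.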
Existing audio and video synchronization technologies and solutions fall into several categories. One category of solutions transmits a continuous clock or periodic synchronization signal, to which a locally generated clock is synchronized or from which a local clock is generated. These systems synchronize the time base of one system to the time base of another, and are continuous synchronization systems designed to keep two or more systems in constant time base synchronization. Solutions in this category include International Engineering Consortium (“IEC”) IEC958, Audio Engineering Society/European Broadcast Union (“AES/EBU”) and Sony/Philips Digital Interface (“S/PDIF”) clock recovery and synchronization, frequency shift keying (“FSK”) based clock recovery and synchronization, and video genlock systems, which use signal event synchronization (e.g., black burst, color burst, H-sync) and master clock synchronization. However, a time base synchronization system is generally overly burdensome for the synchronization of single events between multiple nodes (e.g., multiple PCs and/or other devices) of a home network system.
Another category of synchronization solution is absolute system time synchronization, which involves the generation and synchronization of an absolute system time in hours, minutes, and seconds in order to synchronize events. Using such a solution, the absolute time is transmitted to each system. Examples of these systems are Society of Motion Picture and Television Engineers (“SMPTE”) Longitudinal Time Code and SMPTE Vertical Interval Time Code, which use an 80-bit string to relay the absolute time information. These solutions are designed to keep multiple systems synchronized to an absolute system time base such that a string of events can be in time synchronization; they are not designed specifically to synchronize single events, such as playback of a single piece of content.
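The arithmetic underlying such absolute time codes is straightforward: a non-drop-frame SMPTE time code is a mixed-radix encoding of a linear frame count. A minimal sketch of the conversion in both directions (function names are illustrative, not taken from any SMPTE reference implementation, and drop-frame compensation is omitted):

```python
def timecode_to_frames(hh, mm, ss, ff, fps=30):
    """Convert a non-drop-frame SMPTE time code (hh:mm:ss:ff) to an
    absolute frame count from time zero."""
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps=30):
    """Inverse conversion: absolute frame count back to (hh, mm, ss, ff)."""
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return hh, mm, ss, ff
```

For example, time code 01:02:03:04 at 30 frames per second corresponds to frame 111,694, and converting that frame count back recovers the original time code.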
Another category of synchronization solution is universal time synchronization, designed to set system clocks such that they are synchronized to a national or international time standard. In this way, systems can use real time to synchronize and log events. These systems often use Coordinated Universal Time (“UTC”) and Network Time Protocol (“NTP”) based solutions to transmit time information over TCP/IP networks and synchronize multiple systems. Both client-server architectures and multicast solutions are employed with NTP. This type of system is a continuous time-based synchronization system, not designed specifically to synchronize single events.
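NTP estimates a client's clock offset from four timestamps captured around a request/response exchange: t0 (request sent, client clock), t1 (request received, server clock), t2 (reply sent, server clock), and t3 (reply received, client clock). A minimal sketch of the standard offset and round-trip-delay arithmetic follows; the numeric exchange at the bottom is illustrative, not a real network transaction:

```python
def ntp_offset_and_delay(t0, t1, t2, t3):
    """Standard NTP clock-offset and round-trip-delay estimates.

    t0: request sent (client clock)   t1: request received (server clock)
    t2: reply sent (server clock)     t3: reply received (client clock)
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2.0   # how far the client clock lags the server
    delay = (t3 - t0) - (t2 - t1)            # round-trip time excluding server processing
    return offset, delay

# Illustrative exchange: the client clock runs 0.5 s behind the server,
# with 0.1 s of symmetric network delay in each direction.
offset, delay = ntp_offset_and_delay(t0=100.0, t1=100.6, t2=100.65, t3=100.25)
```

With symmetric path delays this recovers the true 0.5 s offset and 0.2 s round trip; asymmetric paths bias the estimate, which is one reason NTP is a continuous, statistically filtered synchronization system rather than a single-event mechanism.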
Dedicated, proprietary radio frequency (“RF”) one-to-many broadcast systems are used to deliver synchronized streams. With such a one-to-many system, synchronization is assured. However, this type of system cannot then support unique audio or video streams being sent to individual nodes. Delivery of unique audio streams to individual nodes is the most common type of content delivery over home networks, e.g., where different people want to listen to unique music in different parts of the house.
Audio/video (“A/V”) synchronization systems also exist which embed time stamps into packetized streams for synchronized audio and video decoding and presentation, such as MPEG-2 and MPEG-4 Decode Time Stamp (“DTS”), Presentation Time Stamp (“PTS”), and System Clock Reference (“SCR”) data. These embedded time stamps function to provide synchronized audio and video content from a single stream for a single rendering system; they do not function to synchronize two identical audio streams across two rendering systems.
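MPEG PTS and DTS values are expressed as ticks of a 90 kHz clock, and a decoder presents a frame when its recovered system clock reaches the frame's PTS. A minimal sketch of that comparison, simplified to omit the 33-bit PTS wrap-around and clock-drift handling a real decoder needs (function names are illustrative):

```python
PTS_CLOCK_HZ = 90_000  # MPEG PTS/DTS tick rate (90 kHz)

def pts_to_seconds(pts):
    """Convert a 90 kHz PTS tick count to seconds of stream time."""
    return pts / PTS_CLOCK_HZ

def due_for_presentation(frame_pts, system_clock_pts):
    """True once the decoder's system clock has reached the frame's PTS."""
    return system_clock_pts >= frame_pts

# At 25 fps, successive video frames are spaced 90_000 / 25 = 3600 ticks apart.
```

Because the comparison is between a stream-embedded stamp and that one decoder's own clock, the mechanism keeps audio and video aligned within a single rendering system but says nothing about how two independent rendering systems relate to each other, which is the gap the multi-node scenario above exposes.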