This invention relates to time-based processes and data and, more specifically, to an approach for maintaining or synchronizing one or more time bases in one time-based process in relation to one or more clocks or time bases in another time-based process.
Time-based data such as video sequences, audio sequences, financial results, and laboratory data recorded over time may be represented in the metaphor of a “movie.” Examples include QUICKTIME movies and movie objects from APPLE Inc. of Cupertino, Calif. Such time-based data may be consumed and played or displayed by an application or process running in a computing environment or operating system of a device or among several devices that are connected via a communications network.
A common need with regard to time-based data, such as a movie object with video, audio, and closed captioning, is that all elements of the movie must be synchronized. For example, the audio needs to match the video, so that an actor's lips match the words being spoken, and the closed captioning needs to match those same words. This synchronization must remain consistent when the movie is fast-forwarded, paused, reversed, or stopped and then restarted at a later point in the movie.
One approach to controlling timing issues in movies, and to synchronizing data within them, is the use of a clock and of time bases derived from the clock. In general, clock components advance continuously and derive their timing information from some external source (for example, an audio clock based on a crystal timing mechanism) to provide the basic timing. Time bases describe the context of time in the currently playing movie; for example, the current playback position relative to the entirety of the movie data. Under one approach, time bases rely on either a clock component or another time base for their time source.
Using this approach, time can be converted from one time base into a time that is relative to another time base, but only if both time bases rely on the same time source, such as a clock driven by one specific audio device. For example, in order to synchronize audio and video data, current approaches have the audio time base and video time base rooted in the audio clock. This is because while frames may be dropped from video playback or sent to a display at a higher or lower rate without perceptibly changing the viewing experience, it is more difficult to play audio faster or slower without changing the listening experience.
Although the audio and video time bases are independent, it is only because they are rooted in the same time source that this manipulation of the time bases is possible. Current approaches use algebraic manipulation to derive relationships between independent time bases rooted in the same time source.
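The algebraic manipulation described above can be sketched as follows. This is a hypothetical model, not the specific implementation of any prior system: each time base is represented by an anchor pair (a reading of the shared clock and the corresponding time-base reading) plus a rate, and conversion between two time bases proceeds through the common clock.

```python
from dataclasses import dataclass

@dataclass
class TimeBase:
    """A time base anchored to a shared clock: when the clock reads
    `anchor_clock`, this time base reads `anchor_time`, and it advances
    at `rate` time-base seconds per clock second."""
    anchor_clock: float
    anchor_time: float
    rate: float

    def to_clock(self, t: float) -> float:
        """Map a time on this time base back to the underlying clock."""
        return self.anchor_clock + (t - self.anchor_time) / self.rate

    def from_clock(self, clock_t: float) -> float:
        """Map a clock time to a time on this time base."""
        return self.anchor_time + (clock_t - self.anchor_clock) * self.rate

def convert(t: float, src: "TimeBase", dst: "TimeBase") -> float:
    """Convert src time to dst time. Valid only because both time
    bases are rooted in the same clock."""
    return dst.from_clock(src.to_clock(t))

# Audio time base started when the shared clock read 100.0, at 1x rate.
audio = TimeBase(anchor_clock=100.0, anchor_time=0.0, rate=1.0)
# Video time base started two clock seconds later, also at 1x rate.
video = TimeBase(anchor_clock=102.0, anchor_time=0.0, rate=1.0)

print(convert(5.0, audio, video))  # 3.0
```

Note that if `src` and `dst` were anchored to different physical clocks, the intermediate clock value would be meaningless to `dst`, which is precisely the limitation described in the following paragraph.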
Current approaches are unable to derive accurate relationships between time bases rooted in different time sources. Therefore, under prior approaches, when there are independent audio and video clocks, synchronization is not possible. Another example arises when audio is recorded into a computer: the audio data may be clocked by a crystal clock on the sound card, and not all crystal clocks run at exactly the same speed. In a system with two or more sound cards, each having its own clock, recording via one sound card and playing back via another may produce a slightly different playback speed. Although both sound cards may nominally be clocking data in and out at exactly 44.1 kHz, for example, in fact they will have fractionally different sampling rates because they have fractionally different clock speeds.
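The magnitude of this drift is easy to quantify. As a hypothetical illustration (the specific rates below are assumed, not taken from any particular hardware), suppose the recording card runs at exactly its nominal 44.1 kHz while the playback card runs one hertz slow:

```python
nominal = 44_100.0        # nominal sample rate of both cards, Hz
actual_playback = 44_099.0  # hypothetical: playback card runs ~23 ppm slow

duration = 3600.0         # one hour of material recorded at the nominal rate
samples = nominal * duration

# Time the slower card takes to play back the same number of samples:
playback_time = samples / actual_playback
drift_seconds = playback_time - duration
print(f"drift after one hour: {drift_seconds * 1000:.0f} ms")
```

Even this tiny fractional difference (about 23 parts per million) accumulates to roughly 80 milliseconds of drift per hour, which is well within the range of audible and visible synchronization error.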
This leads to problems in a system where one clock cannot be designated as the master clock; for example, when synchronizing audio to an external digital timing clock. Because the external clock may not run at exactly the same speed as the clocks on the sound cards, drift between the tracks may be introduced, and because no approach exists to define accurate relationships between time bases based on disparate clocks, the drift cannot be rectified. Similar problems arise when the audio is stable but one desires to synchronize audio data and video data, each with an independent time base rooted in a common clock, to an outside source such as a MIDI sequencer. The MIDI sequencer has its own clock, and if that clock runs a little slow, the MIDI sequence tempo will drop to match it, while the audio and video carry on at their original rates, causing the MIDI to drift relative to the audio and video.
Under certain conditions, a first application or process may utilize a clock, such as an audio clock, to control the timing of data handling while a second application or process may need to have access to information about the same audio clock in order to interact with the first process. For example, a device may employ a media playback process that interacts with a user interface (UI) process. The UI process allows a user to control the playing of media (e.g., audio and video) by the media playback process.
One problem is that the UI process may not typically have the ability to access the audio clock or a related time base and, therefore, must request audio clock or time base information continuously from the media playback process in order to synchronize the UI process with the media playback process. Such continuous remote procedure calls or requests for timing information from the media playback process can consume significant processing power that can affect system response times or result in excessive battery usage in, for example, a portable media device.
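The cost of the continuous requests described above stems from the fact that, without access to the clock, the UI process cannot compute the current playback time on its own. As a purely hypothetical sketch (the snapshot fields and function names below are illustrative, and this does not represent the mechanism of any particular system), if the playback process could publish a single snapshot of host time, media time, and rate, the UI process could extrapolate locally instead of issuing a remote procedure call on every refresh:

```python
def current_media_time(snap: dict, now: float) -> float:
    """Extrapolate the current media time from a published snapshot,
    avoiding a remote procedure call on every UI refresh.
    `now` is the UI process's reading of the shared host clock."""
    return snap["media_time"] + (now - snap["host_time"]) * snap["rate"]

# Hypothetical snapshot published once by the playback process:
# when the host clock read 100.0, the media time was 12.5, playing at 1x.
snap = {"host_time": 100.0, "media_time": 12.5, "rate": 1.0}

print(current_media_time(snap, now=101.0))  # 13.5
```

A snapshot of this kind would need to be republished only when the rate or position changes (e.g., on pause, seek, or rate change), rather than on every UI update, which is the economy of communication the following paragraph calls for.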
Accordingly, there is a need to enable one time-based process to synchronize media handling or processing with at least one other time-based process without the need for excessive communications between the processes that may adversely impact the performance of a media device or other media processing system.