The present invention relates to streaming media. More particularly, the present invention relates to methods and apparatus for serving streaming media with increased performance.
Typical file caching methods include a cache receiving a file from a file server and storing the entire file. Later, when a client desires the file, instead of serving the file from the file server, the file is served from the cache. Because the cache is typically a server that is closer to the client, or has higher bandwidth than the file server, the file is served to the client quickly from the cache.
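The classical whole-file caching scheme described above can be sketched as follows. This is a minimal illustration only, not the invention itself; the `origin_fetch` callable is a hypothetical stand-in for the file server.

```python
class FileCache:
    """Classical file cache: stores entire files and serves them on later requests."""

    def __init__(self, origin_fetch):
        self._origin_fetch = origin_fetch  # retrieves a file from the file server
        self._store = {}                   # filename -> complete file contents

    def get(self, filename):
        if filename not in self._store:
            # Cache miss: receive the file from the file server and
            # store the entire file before serving it.
            self._store[filename] = self._origin_fetch(filename)
        # Cache hit (or freshly filled): serve from the cache, which is
        # typically closer to the client than the file server.
        return self._store[filename]


# Illustrative use with a stand-in origin server:
cache = FileCache(lambda name: b"contents of " + name.encode())
data = cache.get("movie.mpg")   # first request fills the cache
data = cache.get("movie.mpg")   # later requests are served from the cache
```

Note that the cache here treats each file as an opaque unit, which is precisely the property that, as discussed below, breaks down for streaming media.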
It has been discovered by the inventors that attempting to apply typical file caching methods to files that include streaming media data raises many new problems. For instance, serving a streaming media data file from a cache requires much more processing by the cache than with classical file transfers over the web. For example, during normal playback, the cache may need to perform a significant quantity of processing such as packet modification, packet resequencing, packet retiming, packet assembly, and other computationally intensive functions. As another example, the cache may be called upon to perform random access within the streaming media data file as a result of a client “rewind” or “fast forward” operation. Because classical caching is typically file-based, such a random access would involve moving within a very large data file.
Another drawback is that since streaming media data files are very large, a huge penalty is incurred if the streaming media data file is deleted. Typically, if a file cache determines that it needs more disk space for new files, it will first delete older files, regardless of their size. As an example, if an older file is a streaming media data file that stores an hour-long program, the entire hour-long program is deleted even if the cache only needs to free up the equivalent of 1 minute of space.
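The whole-file eviction penalty described above can be sketched as follows. The filenames and sizes are illustrative assumptions, with disk usage expressed in minutes of media for simplicity.

```python
def evict_whole_files(files, space_needed):
    """Classical whole-file eviction: delete the oldest files in their
    entirety until enough space is freed, regardless of each file's size.

    `files` is a list of (filename, size_in_minutes) tuples, oldest first.
    """
    freed = 0
    deleted = []
    for name, minutes in files:
        if freed >= space_needed:
            break
        deleted.append(name)   # the ENTIRE file is deleted...
        freed += minutes       # ...even if far less space was needed
    return deleted, freed


# To free the equivalent of 1 minute of space, an entire hour-long
# program is deleted:
deleted, freed = evict_whole_files([("hour_show.mpg", 60)], space_needed=1)
# deleted == ["hour_show.mpg"], freed == 60
```

The sketch makes the mismatch concrete: the cache over-deletes by a factor of sixty for this request, discarding content it may well need again.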
Another drawback is that many different streaming media formats exist, each with its own specific streaming requirements. Thus, in contrast to classical file transfer over the web, where files are essentially opaque to the file cache, a streaming media cache must process the actual contents of the file, beyond mere storage and retrieval, in order to stream data to clients.
An additional drawback is that typical cache retirement schemes are often not suited for streaming media data. In particular, schemes where locations within a fast cache memory are retired and replaced with “fresh” data may degrade performance of a streaming media cache. As an example, one cache retirement scheme includes removing data from a fast cache memory if the data is unused or seldom used. In contrast, it has been discovered by the inventors that data identified by such a retirement scheme represents data that will probably be required in the future. In other words, such a retirement scheme would delete data that is needed. As another example, a FIFO retirement scheme might be used in a traditional cache, i.e., the oldest data is slated for retirement. However, it has been discovered by the inventors that such a retirement scheme may delete data from the fast cache memory that could be advantageously provided to other clients from the fast cache memory.
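The mismatch between a least-recently-used retirement scheme and sequential streaming access can be sketched as follows. The segment names and the two-entry capacity are illustrative assumptions only.

```python
from collections import OrderedDict

class LRUCache:
    """Retires the least-recently-used entry when capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._entries = OrderedDict()  # key -> data, least recently used first

    def put(self, key, data):
        if key in self._entries:
            self._entries.move_to_end(key)
        self._entries[key] = data
        evicted = []
        while len(self._entries) > self.capacity:
            # Retire the least-recently-used entry.
            old_key, _ = self._entries.popitem(last=False)
            evicted.append(old_key)
        return evicted


cache = LRUCache(capacity=2)
cache.put("segment_1", "...")   # opening segment of a program
cache.put("segment_2", "...")
# Caching segment_3 for one client retires segment_1 -- even though a
# second client that has just started playback of the same program will
# need segment_1 next.
evicted = cache.put("segment_3", "...")
# evicted == ["segment_1"]
```

Because streaming clients consume segments in order, the segment a scheme like this labels "least recently used" is often exactly the segment another client is about to request.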
In light of the above, what is required are improved methods and apparatus for serving streaming media to client systems with increased performance capabilities. Further, what is required are methods and apparatus for providing such solutions in economical ways.