As computers achieve higher performance and networks provide higher bandwidth, VoD (Video on Demand) services, which allow a user to acquire and view high-quality video content at any time the user wishes to view it, are increasingly widespread.
As a delivery system related to the present invention, there is fast streaming, which fills a buffer of a given size in a burst fashion before the start of playback in streaming delivery, for the purpose of improving the user's quality of experience in such a service (cf., e.g., Patent Document 1).
Further, as a delivery system related to the present invention, there is a technique such as progressive download, which downloads a video content using a protocol for bulk data transfer, such as HTTP (Hyper Text Transfer Protocol)/TCP (Transmission Control Protocol), and plays back the content in parallel with the download. With such techniques, a delivery system related to the present invention can reduce the waiting time until the start of viewing while allocating a buffer size that avoids interruption of viewing.
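As an aid to understanding, the parallel download-and-playback behavior of progressive download may be sketched as follows. This is a minimal illustration only: the chunk model, chunk count, and thread structure are assumptions for exposition and are not taken from any patent document.

```python
import queue
import threading

CHUNK_COUNT = 10  # hypothetical number of chunks in the video file


def download(buffer: queue.Queue) -> None:
    """Downloader: fetches chunks over a bulk-transfer protocol (simulated)."""
    for i in range(CHUNK_COUNT):
        chunk = f"chunk-{i}"  # stands in for HTTP/TCP payload data
        buffer.put(chunk)     # append to the playback buffer
    buffer.put(None)          # sentinel: download finished


def play(buffer: queue.Queue) -> list:
    """Player: consumes chunks from the buffer in parallel with the download."""
    played = []
    while True:
        chunk = buffer.get()
        if chunk is None:
            break
        played.append(chunk)  # decoding/rendering would happen here
    return played


buffer = queue.Queue()
downloader = threading.Thread(target=download, args=(buffer,))
downloader.start()
played = play(buffer)  # playback begins before the download completes
downloader.join()
```

Because the player consumes chunks as soon as they arrive rather than after the whole file is downloaded, the waiting time until the start of viewing is reduced, while the buffer absorbs short-term throughput fluctuations.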
However, in a best-effort network such as the Internet, congestion on a communication path can cause a significant decrease in throughput, an increase in delay time, or the like. Even with such techniques, an enormous waiting time until the start of viewing, or an interruption of viewing, can therefore occur, possibly degrading the user's quality of experience significantly.
On the other hand, as a delivery system related to the present invention, a content cache method as described in, for example, Patent Document 2 is used to reduce the waiting time until the start of viewing and to avoid interruption of viewing.
In the content cache method described in Patent Document 2, an edge server that caches contents is placed between user terminals and a central delivery server. When a request is made for a content cached in the edge server, the content is delivered from the edge server; the communication distance to the user terminal is thereby shortened, reducing the effect of congestion such as a decrease in throughput.
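The edge-server delivery path described above may be sketched as follows. The server classes, content identifiers, and catalog layout are illustrative assumptions introduced only for this sketch.

```python
# Minimal sketch of edge-server content delivery with a central fallback.
# Class names and content identifiers are hypothetical.

class CentralServer:
    def __init__(self, catalog):
        self.catalog = catalog  # content_id -> content data

    def deliver(self, content_id):
        # Delivery over the long communication path to the user.
        return self.catalog[content_id]


class EdgeServer:
    def __init__(self, origin, cached_ids):
        self.origin = origin
        # Pre-populate the cache from the central server's catalog.
        self.cache = {cid: origin.catalog[cid] for cid in cached_ids}

    def deliver(self, content_id):
        # Cache hit: deliver over the short path, near the user terminal.
        if content_id in self.cache:
            return self.cache[content_id], "edge"
        # Cache miss: fall back to the central delivery server.
        return self.origin.deliver(content_id), "origin"


central = CentralServer({"movie-a": b"...", "movie-b": b"..."})
edge = EdgeServer(central, cached_ids=["movie-a"])
```

A request for "movie-a" is served from the edge cache over the short path, while a request for "movie-b" falls back to the central server over the long path; only cache hits benefit from the shortened communication distance.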
The mechanism of the content cache method is described hereinafter in detail. The content cache method calculates a parameter indicating the degree of fitness of each content registered in the VoD service on the basis of the user's preference information, viewing records, service contract details, and promotion contract details with content providers.
Then, the content cache method pre-caches contents into the edge server based on the calculated degrees of fitness. When a new content is registered, the method calculates the degree of fitness of the new content, compares it with the degrees of fitness of the contents cached in the edge server, and replaces cache entries accordingly.
Because the degree of fitness is a parameter whose calculation reflects users' preferences and the promotion status of contents, selecting the contents to be pre-cached into the edge server based on it increases the probability that a content requested by a user is delivered from the cache (the cache hit ratio). The content cache method can thereby probabilistically shorten the communication distance to the user and reduce the effect of congestion such as a decrease in throughput.
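The pre-caching and replacement steps described above may be sketched as follows. The particular fitness formula, the field names, and the capacity model are assumptions made for illustration; Patent Document 2 does not prescribe this formula.

```python
# Hedged sketch of fitness-based pre-caching and cache replacement.
# The fitness weights and the capacity constant are illustrative assumptions.

CACHE_CAPACITY = 2  # number of contents the edge server can hold (assumption)


def fitness(content):
    """Combine preference, viewing, and promotion signals into one score."""
    return (content["preference"] + content["view_count"]
            + content["promotion_weight"])


def precache(contents):
    """Pre-cache the contents with the highest degrees of fitness."""
    ranked = sorted(contents, key=fitness, reverse=True)
    return ranked[:CACHE_CAPACITY]


def register(cache, new_content):
    """On registration of a new content, replace the least-fit cached
    entry if the new content has a higher degree of fitness."""
    worst = min(cache, key=fitness)
    if fitness(new_content) > fitness(worst):
        cache = [c for c in cache if c is not worst] + [new_content]
    return cache
```

Ranking by fitness at pre-cache time, and comparing a newly registered content against the least-fit cached entry, is what keeps the cache biased toward contents users are likely to request, which in turn raises the cache hit ratio.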
[Patent Document 1]
    Japanese Unexamined Patent Application Publication No. 2006-115477
[Patent Document 2]
    Japanese Unexamined Patent Application Publication No. 2005-12282