1. Field of the Invention
The present invention generally relates to a cache server and a distributed cache server system and, more particularly, to a cache server and a distributed cache server system which reduce the traffic on an external network.
2. Description of the Related Art
Recently, as computer networks, such as the Internet, have developed, people have come to publish various sorts of information to computer networks, as well as browse various sorts of information from them.
On the Internet, for example, large-volume content, such as moving images and music, is increasingly provided, and electronic commerce has become popular, partly due to the spread of information technology. Accordingly, the amount of data on the Internet is increasing day by day.
Conventionally, in order to deal with such an increasing amount of data, a cache server or a proxy server having a caching function is provided between an internal network, such as a LAN (local area network), and an external network, such as the Internet. The conventional cache server or proxy server having a caching function (hereinafter generally referred to as a cache server) reduces the traffic between the internal network and the external network by temporarily caching data sent from the external network when a connection is made from the internal network to the external network.
FIG. 1 illustrates an example of a conventional cache server. A cache server 101 shown in FIG. 1 is connected with client PCs 102a to 102d, constituting an internal network. For example, when the client PC 102d makes a request to connect to a Web page, the cache server 101 judges whether or not data of the requested Web page is cached. Then, if not cached, the cache server 101 connects to the Internet 100. Subsequently, the cache server 101 sends data of the Web page sent from the Internet 100 to the client PC 102d as well as temporarily caches the data of the Web page.
Meantime, when the client PC 102a makes a request to connect to the Web page, the cache server 101 judges whether or not data of the requested Web page is cached. Then, if cached, the cache server 101 does not connect to the Internet 100, but sends the cached data of the Web page to the client PC 102a. 
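The conventional cache-on-miss behavior described above (FIG. 1) can be sketched as follows. This is an illustrative sketch only; the class and the `fetch_from_internet` callback are assumed names standing in for an actual HTTP fetch, not part of the conventional system as disclosed.

```python
class CacheServer:
    """Illustrative sketch of a conventional cache server (FIG. 1):
    serve cached data when available, otherwise fetch from the
    Internet and temporarily cache the result."""

    def __init__(self, fetch_from_internet):
        self._fetch = fetch_from_internet  # hypothetical origin fetcher
        self._cache = {}                   # url -> page data

    def request(self, url):
        # Judge whether data of the requested Web page is cached.
        if url in self._cache:
            return self._cache[url]        # cached: no external traffic
        data = self._fetch(url)            # not cached: connect externally
        self._cache[url] = data            # temporarily cache the data
        return data
```

A second request for the same page is then served locally, which is how the traffic between the internal and external networks is reduced.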
FIG. 2 illustrates an example of cache servers provided in a conventional large-scale computer network. In a large-scale computer network, such as that of a corporation, a cache server is provided for each of the head office and branch offices, for example. Cache servers 103a to 103c shown in FIG. 2 are provided respectively for the head office and branch offices, for example, and are connected with client PCs 104a to 104c, respectively, constituting an internal network. The cache servers 103a to 103c perform a process referred to as mirroring, which synchronizes cache data among the cache servers 103a to 103c so that they retain the same cache data.
For example, when the client PC 104a makes a request to connect to a Web page, the cache server 103a judges whether or not data of the requested Web page is cached. Then, if not cached, the cache server 103a connects to the Internet 100. Subsequently, the cache server 103a sends data of the Web page sent from the Internet 100 to the client PC 104a as well as temporarily caches the data of the Web page. Additionally, the cache server 103a performs the mirroring of the cached data of the Web page among the cache servers 103a to 103c. 
FIG. 3 illustrates another example of cache servers provided in a conventional large-scale computer network. Cache servers 105a to 105c shown in FIG. 3 are connected with client PCs 106a to 106c, respectively, constituting an internal network. The cache servers 105a to 105c shown in FIG. 3 do not perform the mirroring of data, but each of the cache servers 105a to 105c individually caches data.
For example, the cache server 105a temporarily caches only data of a Web page requested for connection by the client PC 106a. Likewise, the cache servers 105b and 105c temporarily cache only data of Web pages requested for connection by the client PCs 106b and 106c, respectively.
However, the cache servers 103a to 103c shown in FIG. 2 increase the amount of cache data by retaining the same cache data among the cache servers 103a to 103c through mirroring. Therefore, the cache servers 103a to 103c have a problem in that a recording medium such as a hard disk suffers a heavy load from the increasing amount of cache data. Additionally, because the amount of cache data increases so markedly, the cache servers 103a to 103c are unable to operate effectively in a large-scale computer network.
On the other hand, the cache servers 105a to 105c shown in FIG. 3 do not perform the mirroring of data, but each of the cache servers 105a to 105c individually caches data. Therefore, the amount of cache data decreases. As a result, the cache servers 105a to 105c are more likely not to have cached data that is requested for connection by the client PCs 106a to 106c, and thus connect to the Internet 100 more often. Therefore, the cache servers 105a to 105c are unable to reduce the traffic of an external network effectively.
It is a general object of the present invention to provide an improved and useful cache server and a distributed cache server system in which the above-mentioned problems are eliminated.
A more specific object of the present invention is to provide a cache server and a distributed cache server system in which, while the amount of cache data retained by each cache server is decreased, the traffic of an external network can be reduced.
In order to achieve the above-mentioned objects, there is provided according to one aspect of the present invention a cache server provided in an internal network, the cache server comprising:
a cache-data-list table storing information concerning data retained by other cache servers on an individual cache server basis; and
a cache-data-list administrating unit receiving a data-inquiry request from one of the other cache servers, and searching the cache-data-list table for cache-server information indicating another one of the other cache servers retaining data regarding the data-inquiry request so as to send the cache-server information to the one of the other cache servers.
According to the present invention, the cache server can administer information concerning data retained by other cache servers so that the other cache servers do not have to retain the same data as one another. This reduces the total amount of cache data so as to lessen the load imposed on a recording medium such as a hard disk. Furthermore, the other cache servers can obtain data from one another so as to alleviate the traffic of an external network.
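The cache-data-list table and administrating unit described above can be sketched as follows. This is an illustrative sketch: the class and method names are assumptions, and the table is reduced to a mapping from each piece of data (identified here by URL) to the cache server retaining it.

```python
class CacheDataListAdministrator:
    """Illustrative sketch of the cache-data-list table and the
    cache-data-list administrating unit: information concerning data
    retained by other cache servers, on an individual server basis."""

    def __init__(self):
        self._table = {}  # url -> identifier of the retaining cache server

    def register(self, server_id, url):
        # A cache server reports data it retains.
        self._table[url] = server_id

    def inquire(self, url):
        # Answer a data-inquiry request with cache-server information
        # indicating which server retains the data, or None if none does.
        return self._table.get(url)
```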
Additionally, the cache server according to the present invention may further comprise:
a relative-position information table storing information indicating a distance between each two of the other cache servers; and
a control-data sending and receiving unit obtaining a distance between the one of the other cache servers and the said another one of the other cache servers from the relative-position information table so as to order the one of the other cache servers to retain the data regarding the data-inquiry request depending on the distance.
Additionally, in the cache server according to the present invention, the control-data sending and receiving unit orders the one of the other cache servers to retain the data regarding the data-inquiry request when the distance is larger than a predetermined distance.
According to the present invention, when one of the other cache servers makes a request to obtain data retained by another cache server that is several cache servers away therefrom, the one of the other cache servers can cache the data in itself so as not to make another request from next time on.
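The distance-based decision made by the control-data sending and receiving unit can be sketched as follows. The threshold value and the pair-keyed distance map are illustrative assumptions; the invention only specifies that retention is ordered when the distance exceeds a predetermined distance.

```python
PREDETERMINED_DISTANCE = 2  # illustrative threshold, not from the disclosure

def should_retain(relative_position, requester, holder,
                  threshold=PREDETERMINED_DISTANCE):
    """Illustrative sketch: consult the relative-position information
    table for the distance between the requesting server and the
    retaining server, and order retention only when that distance
    is larger than the predetermined distance."""
    distance = relative_position[frozenset((requester, holder))]
    return distance > threshold
```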
In order to achieve the above-mentioned objects, there is also provided according to another aspect of the present invention a distributed cache server system provided in an internal network, the distributed cache server system comprising:
an owner cache server; and
member cache servers, each of which retains data from an external network, and sends information concerning the data to the owner cache server so that the owner cache server stores the information so as to search for cache-server information therefrom indicating one of the member cache servers retaining data regarding a data-obtain request received from within the internal network,
wherein, when the owner cache server finds the cache-server information, another one of the member cache servers obtains the data regarding the data-obtain request from the one of the member cache servers, and
when the owner cache server does not find the cache-server information, the said another one of the member cache servers obtains the data regarding the data-obtain request from the external network.
Additionally, in the distributed cache server system according to the present invention, the owner cache server may include:
a cache-data-list table storing the information; and
a cache-data-list administrating unit receiving a data-inquiry request from another one of the member cache servers, and searching the cache-data-list table for the cache-server information indicating the one of the member cache servers retaining data regarding the data-inquiry request so as to send the cache-server information to the said another one of the member cache servers.
Additionally, in the distributed cache server system according to the present invention, each of the member cache servers may include:
a cache-data DB retaining the data from the external network; and
a cache-data sending and receiving unit sending information concerning the data retained from the external network to the owner cache server.
According to the present invention, the owner cache server can administer information concerning data retained by the member cache servers so that the member cache servers do not have to retain the same data as each other. This reduces the amount of cache data retained in the distributed cache server system so as to lessen the load imposed on a recording medium such as a hard disk. Furthermore, the member cache servers can obtain data from each other so as to alleviate the traffic of an external network.
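The owner/member flow described above can be sketched as follows. This is an illustrative sketch under simplifying assumptions: the class and callback names are invented, peer communication is reduced to direct dictionary access, and each member reports retained data to the owner as described.

```python
class Owner:
    """Illustrative owner cache server: stores which member retains
    which data (cache-data-list table)."""
    def __init__(self):
        self.table = {}  # url -> member id

    def report(self, member_id, url):
        self.table[url] = member_id

    def inquire(self, url):
        return self.table.get(url)


class Member:
    """Illustrative member cache server; fetch_external and peers
    stand in for external-network and internal-network I/O."""
    def __init__(self, member_id, owner, fetch_external, peers):
        self.id, self.owner = member_id, owner
        self.db = {}                     # cache-data DB
        self.fetch_external = fetch_external
        self.peers = peers               # member id -> Member

    def obtain(self, url):
        if url in self.db:
            return self.db[url]
        holder = self.owner.inquire(url)        # data-inquiry request
        if holder is not None and holder != self.id:
            return self.peers[holder].db[url]   # obtain from that member
        data = self.fetch_external(url)         # miss: external network
        self.db[url] = data
        self.owner.report(self.id, url)         # inform the owner
        return data
```

In this sketch, only the first member to request a page reaches the external network; later requests from other members are served within the internal network.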
Additionally, in the distributed cache server system according to the present invention, the owner cache server may include:
a relative-position information table storing information indicating a distance between each two of the member cache servers; and
a control-data sending and receiving unit obtaining a distance between the one of the member cache servers and the said another one of the member cache servers from the relative-position information table so as to order the said another one of the member cache servers to retain the data regarding the data-inquiry request depending on the distance.
According to the present invention, when the said another one of the member cache servers makes a request to obtain data retained by the one of the member cache servers that is several cache servers away therefrom, the said another one of the member cache servers can retain the data in itself so as not to make another request from next time on.
Additionally, in the distributed cache server system according to the present invention, each of the member cache servers may include:
a keyword-information table storing keywords; and
a cache-data administrating unit counting a number of times each of the keywords appears in the data obtained from the one of the member cache servers,
so that the said another one of the member cache servers retains the data obtained from the one of the member cache servers in the cache-data DB depending on the number of times.
According to the present invention, when the said another one of the member cache servers makes a request to obtain frequently requested data containing the keywords from the one of the member cache servers, the said another one of the member cache servers can retain the frequently requested data in itself so as not to make another request for the frequently requested data from next time on.
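The keyword-based retention decision can be sketched as follows. The keyword list and threshold are illustrative assumptions; the disclosure only specifies counting occurrences of stored keywords in the obtained data and retaining the data depending on that count.

```python
KEYWORDS = ("news", "stock")  # illustrative keyword-information table
RETAIN_THRESHOLD = 3          # illustrative threshold, not from the disclosure

def count_keywords(data, keywords=KEYWORDS):
    """Illustrative cache-data administrating unit: count how many
    times each stored keyword appears in the obtained data."""
    return sum(data.count(k) for k in keywords)

def retain_if_frequent(db, url, data, threshold=RETAIN_THRESHOLD):
    # Retain the obtained data in the local cache-data DB only when
    # the keyword count reaches the threshold.
    if count_keywords(data) >= threshold:
        db[url] = data
```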
In order to achieve the above-mentioned objects, there is also provided according to another aspect of the present invention a distributed cache server system provided in an internal network, the distributed cache server system comprising:
groups each comprising an owner cache server and other cache servers, the groups being configured in a hierarchical structure, each of the other cache servers retaining data from an external network, and sending information concerning the data to the owner cache server of each of the groups so that the owner cache server stores the information so as to search therefrom for cache-server information indicating one of the other cache servers retaining data regarding a data-obtain request received from within the internal network,
wherein, when the owner cache server finds the cache-server information, another one of the other cache servers in each of the groups obtains the data regarding the data-obtain request from the one of the other cache servers, and
when the owner cache server does not find the cache-server information, the said another one of the other cache servers in each of the groups obtains the data regarding the data-obtain request from the external network or from another one of the groups.
According to the present invention, the cache servers can be configured in a hierarchical structure so that information concerning cache data retained in the cache servers is administered effectively according to the hierarchical structure. Consequently, the amount of cache data retained in the distributed cache server system is reduced, and the system can be applied more effectively to a large-scale computer network.
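The hierarchical lookup can be sketched as follows. This is an illustrative sketch: each group's owner keeps its own table, and a miss in one group escalates to the parent group before falling back to the external network. The class and attribute names are assumptions.

```python
class Group:
    """Illustrative group in the hierarchical structure: an owner
    table for this group, plus an optional parent group."""
    def __init__(self, parent=None):
        self.table = {}      # url -> retaining cache server in this group
        self.parent = parent

    def find_holder(self, url):
        # Search this group's cache-server information first.
        if url in self.table:
            return self.table[url]
        # On a miss, escalate the inquiry up the hierarchy.
        if self.parent is not None:
            return self.parent.find_holder(url)
        return None          # retained nowhere: use the external network
```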
Other objects, features and advantages of the present invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings.