1. Field of the Present Invention
The present invention generally relates to the field of data processing networks and more particularly to a system and method in which portions of system memory are dynamically deactivated based upon the currency of stored data to reduce power consumption.
2. History of Related Art
In the field of data processing systems and networks, server devices and server clusters are widely employed to provide web and web application services. Frequently, the servers in a particular cluster are provided in a rack-mounted or other dense configuration in which a large number of server devices are located in close proximity to one another. Two major goals for operators of these servers are to reduce the amount of electricity the servers consume and the amount of heat they generate; both goals can be met by reducing the number of watts the servers dissipate. Typically, such servers use large amounts of memory to cache data that has already been requested, on the assumption that there will be further requests for the same information. Although the data in these caches is subject to replacement as new data displaces old data, the cache size generally does not vary considerably over time. The data cache frequently represents a substantial portion of system memory usage because these servers typically run few, if any, other applications.

During periods of significant network traffic, a large data cache is desirable to maintain an acceptable cache hit rate, i.e., the probability that a requested file or other data is present in the cache. If a data request "misses" in the cache, a time-consuming retrieval of the requested data from persistent storage (disk) is required. During periods of low activity, however, an acceptable cache hit rate may be maintained with a considerably smaller data cache. It would therefore be desirable to implement a system and method for adjusting the size of the data cache based upon factors such as the amount of network traffic being serviced, so as to maintain the smallest cache required to achieve a desired level of performance.
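The cache hit rate discussed above can be tracked with a simple pair of counters. The following sketch is purely illustrative (the class and method names are assumptions for exposition and form no part of any claimed method):

```python
class HitRateMonitor:
    """Tracks cache hits and misses and reports the observed hit rate.

    Illustrative only: names and structure are hypothetical.
    """

    def __init__(self):
        self.hits = 0
        self.misses = 0

    def record(self, was_hit):
        """Record the outcome of one cache lookup."""
        if was_hit:
            self.hits += 1
        else:
            self.misses += 1

    def hit_rate(self):
        """Return the fraction of lookups that were hits (1.0 if none yet)."""
        total = self.hits + self.misses
        return self.hits / total if total else 1.0
```

A cache-sizing policy could periodically compare this observed rate against upper and lower thresholds to decide whether the cache may shrink or must grow.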
It would be further desirable if portions of the system memory could be dynamically and selectively deactivated when not required for use in the data cache to minimize the power consumption and heat dissipation of the system.
3. Summary of the Invention
The problems identified above are in large part addressed by a data processing network, server, and method in which the performance of a file cache associated with an application program running on the server device is monitored. The file cache is purged of stale data if the performance exceeds a specified criterion, and the cache is re-allocated to occupy a smaller number of system memory sections. One or more sections are then deactivated following reallocation if the section no longer contains a portion of the file cache. The file cache performance criteria may include the file cache hit rate. Additional memory may be allocated for the file cache if the monitored performance falls below a specified limit, and one or more sections of the system memory may be activated if the additional memory comprises a portion of a previously deactivated physical memory section. Deactivating and activating a system memory section may include switching power to the section off and on, respectively.
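The purge, compact, deactivate, and reactivate steps described above can be sketched as follows. All names, the section size, and the staleness policy are illustrative assumptions chosen for exposition; an actual embodiment would operate on physical memory sections and power-control hardware rather than a Python dictionary:

```python
SECTION_SIZE = 4  # cache entries per memory section (illustrative value)

class SectionedFileCache:
    """File cache spread across power-gated memory sections (sketch)."""

    def __init__(self, sections):
        self.active = sections   # number of powered-on sections
        self.entries = {}        # key -> (data, last_access_tick)
        self.tick = 0            # logical clock for recency tracking

    def capacity(self):
        return self.active * SECTION_SIZE

    def store(self, key, data):
        if len(self.entries) < self.capacity():
            self.tick += 1
            self.entries[key] = (data, self.tick)

    def lookup(self, key):
        self.tick += 1
        found = self.entries.get(key)
        if found:
            self.entries[key] = (found[0], self.tick)  # refresh recency
            return found[0]
        return None  # miss: caller fetches from persistent storage

    def shrink(self, stale_age):
        """Purge stale entries, compact the cache, deactivate empty sections."""
        self.entries = {k: v for k, v in self.entries.items()
                        if self.tick - v[1] <= stale_age}
        needed = max(1, -(-len(self.entries) // SECTION_SIZE))  # ceil division
        freed = self.active - needed
        self.active = needed     # sections beyond `needed` are powered off
        return freed

    def grow(self):
        """Power one previously deactivated section back on."""
        self.active += 1
```

In this sketch, a supervising loop would call `shrink()` when the monitored hit rate exceeds the upper criterion and `grow()` when it falls below the lower limit, so that only as many sections as the cache actually occupies remain powered.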