1. Field of the Invention
The present invention relates to an information processing technique for controlling fetching and prefetching, with respect to a cache storage apparatus, of data in a memory.
2. Description of the Related Art
Although advances in semiconductor integration have improved the speeds of both processing apparatuses and the DRAM that is typically used as a main memory, the improvement in the speed of DRAM has been smaller than the improvement in the speed of processing apparatuses. In order to bridge this gap in speeds, a cache memory, which is of low capacity but is high speed, is arranged between the high-speed processing apparatus and the low-speed main memory.
Regarding cache memory, in many cases, when a memory access is performed, the contents of the main memory are first copied into the cache memory. In order to suppress memory latency, a cache prefetch is performed in which the contents of the main memory are copied into the cache memory in advance for an address range that will be used.
Performing a prefetch as far as possible in advance of the fetch that actually performs the processing can suppress memory latency. However, if the prefetch is performed too far in advance, there is a problem in that prefetched data that is to be used later will be replaced by newly prefetched data before it is used.
In Japanese Patent Laid-Open No. H10-320285, a lock bit is set when data that is a prefetch target is stored, so that the prefetched data is not replaced prior to being read out. Then, when the prefetched data that was locked by the lock bit is read out, the lock bit is canceled. With this configuration, prefetched data that has not been used even one time is not replaced. However, with this approach, a problem still remains for data that will be used two or more times: because the data becomes a replacement target when the lock is canceled at the point in time at which it is used one time, prefetched data that is to be used again thereafter will be replaced by newly prefetched data.
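As an illustration, the lock-bit scheme described above, together with the residual problem for data that is used more than once, can be sketched as follows. This is a minimal simulation under stated assumptions, not the implementation of the cited patent; the names (`CacheLine`, `prefetch_line`, `read_line`) and the two-line fully associative cache are illustrative choices made here.

```c
#include <stdbool.h>

#define NUM_LINES 2  /* illustrative: a tiny fully associative cache */

typedef struct {
    int tag;      /* address tag of the cached line; -1 means invalid */
    bool locked;  /* set when prefetched, cleared on the first read */
} CacheLine;

static CacheLine cache[NUM_LINES] = { { -1, false }, { -1, false } };

/* Return the index of the line holding `tag`, or -1 on a miss. */
static int find_line(int tag) {
    for (int i = 0; i < NUM_LINES; i++)
        if (cache[i].tag == tag) return i;
    return -1;
}

/* Choose a replacement victim: prefer invalid lines, then unlocked
   lines. Returns -1 if every line is locked (no line may be replaced). */
static int choose_victim(void) {
    for (int i = 0; i < NUM_LINES; i++)
        if (cache[i].tag == -1) return i;
    for (int i = 0; i < NUM_LINES; i++)
        if (!cache[i].locked) return i;
    return -1;
}

/* Prefetch: store the line and set its lock bit so it cannot be
   replaced before it is read out. Fails if all lines are locked. */
static bool prefetch_line(int tag) {
    int v = choose_victim();
    if (v < 0) return false;
    cache[v].tag = tag;
    cache[v].locked = true;
    return true;
}

/* Read: the first use cancels the lock bit, after which the line
   becomes a replacement target even if it will be used again. */
static bool read_line(int tag) {
    int i = find_line(tag);
    if (i < 0) return false;  /* miss: the line was replaced or absent */
    cache[i].locked = false;
    return true;
}
```

In this sketch, prefetching lines 10 and 20 locks both lines, so a further prefetch is refused. After line 10 is read once, its lock is canceled, and the next prefetch replaces it; a subsequent read of line 10 then misses even though the program may need it again, which is precisely the residual problem noted above.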