1. Field of the Invention
The present invention relates to a cache control apparatus for a microprocessor.
2. Description of the Related Art
With the trend of semiconductor technologies toward miniaturization, higher frequencies and a higher degree of integration, the performance of microprocessors has improved remarkably. To make the most of the performance of the microprocessor in this situation, a high-speed cache with large capacity and a high hit ratio is required.
A conventional cache control apparatus generally has a hierarchical structure, with a small-capacity cache built into the processor and a large-capacity cache attached as an external unit. Also, in the built-in cache, optimization by the copy-back method is used, with the aim of providing an improved hit ratio by using multiple ways (circuit paths).
In the future, a higher hit ratio will also be required of the external cache. For improving the hit ratio, it is effective to increase the number of ways. The direct-map system, in which the number of ways is one, on the other hand, has simple hardware and a high access speed.
The external cache is configured with a combination of RAMs on a CPU module. A simple implementation of multiple ways therefore requires more address lines, data lines and control lines than a system with fewer ways, and makes the pattern design of the module difficult. The increased number of pins of the processor is also a serious problem. Further, the access time cannot be guaranteed and, in the case where a prediction fails, a delay results.
An object of the present invention is to provide a cache control apparatus for an information processing system comprising a cache memory having a plurality of ways, which can have multiple ways without increasing the hardware amount.
Another object of the invention is to provide a cache control apparatus for an information processing system comprising a cache memory having a plurality of ways, in which high-speed access can be guaranteed and a response is made possible with minimum delay even in the case where the last way is hit or a cache miss occurs.
The present invention has been developed to achieve the objects described above.
According to the present invention, there is provided a cache control apparatus for an information processing system comprising a cache memory having a plurality of ways, in which a cache tag memory and a cache data memory are indexed with the cache index and a way as an address. At the time of cache access, each way is indexed by time division, while at the time of updating the cache tag or cache data, the way to be updated is designated for updating the cache tag and the cache data. The apparatus further comprises a buffer for recording, for each way, the data of the cache tag indexed by time division, means for holding the cache tag data of all the ways until they are completely acquired, and means for making a hit judgment for all the ways at the same time.
According to this invention, a hit judging unit can judge a hit or a miss each time the cache tag data is indexed by time division.
According to this invention, the data are read by time division at the time of indexing the cache tag or the cache data, and therefore the hardware amount can be reduced.
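The time-division indexing described above can be illustrated with a minimal sketch. The class, names and structure below are illustrative assumptions, not the patented circuit: a single tag memory is addressed by (index, way), each way is read out sequentially into a per-way buffer, and hit judgment is made for all the ways at the same time once the buffer is full.

```python
# Hypothetical sketch: a single tag RAM serves all ways by being
# addressed with (index, way); ways are read by time division.
WAYS = 4

class TimeDivisionCache:
    def __init__(self, num_sets):
        # One tag entry per (set, way) in a single shared tag memory.
        self.tags = [[None] * WAYS for _ in range(num_sets)]

    def update(self, index, way, tag):
        # Updating designates the target way explicitly.
        self.tags[index][way] = tag

    def lookup(self, index, tag):
        # Each way is indexed by time division (one read per cycle) and
        # the tag read out is latched into a per-way buffer.
        buffer = []
        for way in range(WAYS):          # sequential, time-divided reads
            buffer.append(self.tags[index][way])
        # Once all ways are held, hit judgment is made for all ways
        # at the same time; returns the hit way, or None on a miss.
        hits = [stored == tag for stored in buffer]
        return hits.index(True) if any(hits) else None

cache = TimeDivisionCache(num_sets=8)
cache.update(index=3, way=2, tag=0xABC)
print(cache.lookup(3, 0xABC))  # -> 2
print(cache.lookup(3, 0xDEF))  # -> None
```

Because only one tag is read per cycle, the same address, data and control lines serve all ways, which is the source of the hardware reduction.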
Also, the apparatus according to this invention can comprise means for always responding to the reader with the cache data of a fixed way, and means for producing a cancel signal in the case of a cache miss in the fixed way and responding with the data of another way which may be hit. According to this invention, the cache is indexed by time division, and the indexed data are sequentially transferred to the reader. In the case where the previously sent data is a cache miss, a cancel signal is output and a response signal is sent for the data of the way that is hit. Thus, even for multiple ways, the access time delay can be minimized.
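The fixed-way response with a cancel signal can be sketched as the following sequence of events. The event names and the generator structure are assumptions for illustration: the data of a fixed way is sent speculatively before hit judgment; if that way misses, a cancel signal follows, and the data of the way actually hit is sent instead.

```python
# Hypothetical sketch: way 0's data is always sent first; on a miss in
# that fixed way, a cancel signal is issued and the hit way's data is
# re-sent. Event names are illustrative assumptions.
FIXED_WAY = 0

def respond(tags, data, tag):
    """Yield (event, payload) pairs as the cache responds over time."""
    # Respond immediately with the fixed way's data, before hit judgment.
    yield ("data", data[FIXED_WAY])
    if tags[FIXED_WAY] == tag:
        yield ("valid", FIXED_WAY)      # fixed way hit: done
        return
    yield ("cancel", FIXED_WAY)         # fixed way missed: cancel it
    for way, stored in enumerate(tags):
        if way != FIXED_WAY and stored == tag:
            yield ("data", data[way])   # re-respond with the hit way
            yield ("valid", way)
            return
    yield ("miss", None)                # no way hit

events = list(respond(tags=[10, 20, 30, 40],
                      data=["a", "b", "c", "d"], tag=30))
print(events)
# -> [('data', 'a'), ('cancel', 0), ('data', 'c'), ('valid', 2)]
```

When the fixed way hits, the reader receives the data with no extra latency; only on a mispick does the cancel-and-resend path add delay.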
The apparatus according to this invention can further comprise means for recording the history of the hit ways, means for predicting from the history a way which may be hit, and means for responding with the cache data in the predicted way, wherein the responding means can include means, in the case of a cache miss in the predicted way, for responding again with the data of a particular way. According to this invention, a way with high probability of a hit is predicted and the tag is indexed, and therefore the access time can be reduced.
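The history-based way prediction can be sketched as follows. The source does not specify the prediction policy; a most-recently-hit-way table per index is assumed here as one simple example of "predicting from the history", and the class and method names are illustrative.

```python
# Hypothetical sketch of history-based way prediction: the most
# recently hit way for each index is recorded and tried first.
# The MRU policy is an assumed example, not the patented method.
class WayPredictor:
    def __init__(self, num_sets, default_way=0):
        # History table: last hit way per set; defaults to a fixed way.
        self.history = [default_way] * num_sets

    def predict(self, index):
        # The predicted way is indexed first on the next access.
        return self.history[index]

    def record_hit(self, index, way):
        # Record the hit way so it is predicted next time.
        self.history[index] = way

pred = WayPredictor(num_sets=4)
print(pred.predict(1))   # -> 0 (the fixed default, before any history)
pred.record_hit(1, 3)
print(pred.predict(1))   # -> 3 (the last hit way is predicted next)
```

If the predicted way misses, the response falls back to the data of a particular way as described above, so a mispredict costs only the cancel-and-resend delay.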
The present invention is applicable to the internal cache as well as to the external cache.