1. Field of the Invention
The invention relates to a cache device and a method of using the same for data accesses, and in particular to a cache device having an improved efficiency by prefetching and storing address data using a prefetch queue comparing circuit.
2. Background of the Related Art
Computers, which have come into wide use alongside the great progress in semiconductor technology, have brought great changes into our lives over the past several decades. Some companies have now successfully produced central processing units (CPUs) with operating clocks of several hundred MHz. Unfortunately, not all devices included in a computer system, such as memory devices, can operate at the same clock rate as the CPU. Although the frequency of the operating clock of CPUs is continuously increased, the access speed of dynamic random access memories (hereinafter referred to as DRAMs) has not been greatly improved. To resolve this problem, a cache device is introduced. That is, in a computer system, DRAMs serve as a primary memory while static random access memories (SRAMs) serve as a cache device. With such a cache device, data likely to be requested by a CPU is transferred in advance from the primary memory to the cache device. In this case, the CPU can access the higher-speed cache device directly instead of the primary memory, thereby reducing data access time. Therefore, a better balance between cost and efficiency can be reached. However, since the cache device has a smaller data memory capacity than the primary memory, the data required by the CPU may not all be stored in the cache memory. If data requested by the CPU are stored in the cache memory, this state is called a "cache hit" and allows the data to be accessed by the CPU in less time. Conversely, if data requested by the CPU are not stored in the cache memory, this state is called a "cache miss." In the "cache miss" state, the CPU has to access the required data through the primary memory, which takes more time. The fraction of accesses that result in a "cache hit" is called the "hit ratio."
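The hit ratio described above can be made concrete with a minimal sketch. The following model is hypothetical and is not part of the disclosed device: it tracks a small fixed-capacity cache as a recency-ordered list (a least-recently-used policy, chosen here only for illustration) and counts what fraction of a sequence of CPU accesses hit the cache.

```python
# Hypothetical sketch: counting cache hits and misses to compute the hit ratio.
# The cache is modeled as a recency-ordered list of addresses (front = newest);
# real hardware instead performs a tag/index lookup as described below.
def hit_ratio(accesses, cache_size):
    cache = []
    hits = 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.remove(addr)        # will be re-inserted at the front
        elif len(cache) >= cache_size:
            cache.pop()               # evict the least-recently-used entry
        cache.insert(0, addr)         # addr is now the most recently used
    return hits / len(accesses)

# Six accesses against a two-entry cache: 0x10 and 0x20 each hit once.
ratio = hit_ratio([0x10, 0x20, 0x10, 0x30, 0x10, 0x20], cache_size=2)
```

A larger cache or a more favorable access pattern raises the hit ratio, which is precisely what reduces the average access time in equation (1) below.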
Referring to FIG. 1, a conventional cache device according to the prior art is shown. In FIG. 1, a cache device 100 mainly consists of a cache memory 110 and a cache control circuit 120. The cache control circuit 120 is responsible for the entire operation of the cache device 100 by controlling the cache memory 110. The cache memory 110 includes a data RAM 112 and a tag RAM 114. The data RAM 112 stores data corresponding to data in the primary memory 140, while the tag RAM 114 stores the tag addresses corresponding to the stored data.
For a detailed description, FIG. 2A illustrates the correspondence between the cache memory 110 and the primary memory 140. As shown in FIG. 2A, the primary memory 140 is divided into several blocks, each given a distinct tag address. Furthermore, the index addresses of each block are the same as those of the tag RAM 114 and the data RAM 112, wherein each index address corresponds both to a tag address stored in the tag RAM 114 and to data stored in the data RAM 112. Referring to FIG. 2B, the combination of a tag address and an index address represents a corresponding address of the primary memory 140. In other words, the data stored at an index address of the data RAM 112, with the corresponding tag address stored in the tag RAM 114, is identical to the data stored at the same address (consisting of the tag address and the index address) of the primary memory 140. As noted above, the cache memory 110 stores only part of the data of the primary memory 140. Therefore, when the cache device 100 handles a data access requested by the CPU, it must be determined whether a "cache hit" or a "cache miss" has occurred, and whether the required data must be re-transferred from the primary memory 140 into the cache memory 110. This determination is made as follows: when a data access request is received from the CPU, the address output from the CPU is compared with the tag addresses stored in the tag RAM 114 together with their corresponding index addresses. If the comparison yields a match, a "cache hit" has occurred; if no entry matches, a "cache miss" has occurred.
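The tag/index lookup described above can be sketched as follows. This is a simplified, hypothetical direct-mapped model, not the disclosed circuit: the sizes (16 cache lines, hence 4 index bits), the `split` helper, and the `read` function are all assumptions made for illustration.

```python
# Hypothetical direct-mapped lookup with 16 cache lines (4 index bits).
# An address from the CPU is split into a tag part and an index part; a "cache
# hit" occurs when the tag stored at that index matches the address's tag part.
INDEX_BITS = 4
NUM_LINES = 1 << INDEX_BITS

tag_ram = [None] * NUM_LINES    # plays the role of the tag RAM 114
data_ram = [None] * NUM_LINES   # plays the role of the data RAM 112

def split(address):
    """Return the (tag, index) pair encoded in a primary-memory address."""
    return address >> INDEX_BITS, address & (NUM_LINES - 1)

def read(address, primary_memory):
    tag, index = split(address)
    if tag_ram[index] == tag:
        return data_ram[index], "hit"
    # Cache miss: fetch the data from the primary memory and fill the line.
    tag_ram[index], data_ram[index] = tag, primary_memory[address]
    return data_ram[index], "miss"

primary = {0x25: "A"}               # a toy primary memory
value1, state1 = read(0x25, primary)  # first access misses and fills the line
value2, state2 = read(0x25, primary)  # second access to the same address hits
```

The key point mirrored from the text is that only the tag need be compared, because the index part of the CPU's address directly selects the cache line to examine.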
Assume that the reference symbol T_WR designates the data access time of the cache memory 110, T_MEM designates the data access time of the primary memory 140, and R_HIT designates the cache hit ratio of the cache device 100. The average data access time T_AV can then be expressed by:

    T_AV = R_HIT (T_WR) + (1 - R_HIT)(T_WR + T_MEM)    (1)

In equation (1), (T_WR + T_MEM) represents the access time required when the cache device 100 experiences a "cache miss," wherein T_MEM is generally much longer than T_WR. In other words, the data access time in the "cache miss" state is much longer, resulting in poor system efficiency.
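Equation (1) can be checked with a short worked instance. The numbers below (a 10 ns cache, a 70 ns primary memory, a 90% hit ratio) are illustrative assumptions, not values taken from the disclosure.

```python
# Worked instance of equation (1): average access time versus hit ratio.
# All timing values are illustrative assumptions, in nanoseconds.
def average_access_time(t_wr, t_mem, r_hit):
    # T_AV = R_HIT * T_WR + (1 - R_HIT) * (T_WR + T_MEM)
    return r_hit * t_wr + (1 - r_hit) * (t_wr + t_mem)

t_av = average_access_time(t_wr=10, t_mem=70, r_hit=0.9)
```

Since T_WR is paid on every access, equation (1) simplifies to T_AV = T_WR + (1 - R_HIT) * T_MEM, which makes plain that raising the hit ratio is the only way to push the average access time down toward T_WR.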