Computers often use virtual memory and cache memory to improve performance. A cache memory (called a cache) is a high-speed memory which stores frequently used data. Cache memories frequently reside on processor chips, between the processor and the RAM of the computer system. Cache memories are also used in peripheral device controllers, such as disk controllers. A virtual memory system performs a very similar function between the RAM in a computer and slower storage, such as a disk storage drive.
In either of these systems, when a unit of data is requested, the system determines whether the unit of data is contained in the high-speed area, such as the cache, or within the slower area, such as the RAM. If the data is contained within the high-speed area, the processor retrieves it directly from the high-speed area and thereby avoids retrieving it from the slower area.
Data is retrieved in the following manner. Data is requested with a key. A translation lookaside buffer, contained within the processor, is then searched for an entry that matches the key. If such an entry is found, the translation lookaside buffer returns the actual location of the data in the high-speed area. The processor then retrieves the data directly from the high-speed area using this location.
In a virtual memory system, the high-speed area is RAM, and the data is retrieved in units called pages. Therefore, the key is a virtual page number (VPN), and the data is contained in a RAM memory page having a real page number (RPN). The translation lookaside buffer contains entries having a VPN and a corresponding RPN. When the translation lookaside buffer is searched using a VPN and a match is found, the buffer returns the corresponding RPN from the matching entry. The RPN is then used to access the data within RAM. If no matching entry is found in the translation lookaside buffer, the data is retrieved from the disk and placed in RAM at some real page location. An entry containing the RPN of that real page location, along with the corresponding VPN, is then placed into one of the entries of the translation lookaside buffer.
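The VPN-to-RPN translation described above can be sketched as follows. This is a minimal illustrative model, not the patented mechanism: the names (`Tlb`, `translate`), the dictionary-backed page table standing in for the disk, and the placeholder eviction policy are all assumptions introduced here.

```python
class Tlb:
    """A tiny translation lookaside buffer mapping VPN -> RPN."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = {}                    # VPN -> RPN

    def lookup(self, vpn):
        """Return the RPN for this VPN, or None on a TLB miss."""
        return self.entries.get(vpn)

    def insert(self, vpn, rpn):
        """Install a new entry, evicting an existing one if full.

        The victim choice here is a placeholder; selecting a good
        victim is exactly the problem discussed below.
        """
        if len(self.entries) >= self.capacity:
            victim = next(iter(self.entries))
            del self.entries[victim]
        self.entries[vpn] = rpn


def translate(tlb, vpn, page_table):
    """A hit returns the cached RPN; a miss consults the page table
    (standing in for the disk fill) and installs the new entry."""
    rpn = tlb.lookup(vpn)
    if rpn is None:
        rpn = page_table[vpn]                # page brought into RAM at this RPN
        tlb.insert(vpn, rpn)
    return rpn
```

Because the buffer holds only `capacity` entries, the third distinct translation in the example below forces an eviction, which motivates the replacement-selection discussion that follows.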
Because the translation lookaside buffer contains a relatively small number of entries, it fills up quickly. Placing an entry into the translation lookaside buffer, therefore, generally requires replacing an existing entry in the translation lookaside buffer. To maximize performance, entries that are not frequently used should be selected for replacement.
Translation lookaside buffer systems employ various well-known approaches for selecting an entry to be replaced. Often, each entry within the translation lookaside buffer contains electronic logic circuits which decide whether the entry should be replaced when a new entry is to be inserted. Therefore, logic circuitry in each of the entries sends a signal which determines whether that particular entry is eligible for replacement. Typically, the signal indicates that the entry is not available for replacement, and the signal is therefore called an exclude signal. Once each entry has produced an exclude signal, additional logic must select, from the entries that are not being excluded, one of the entries for replacement, while at the same time ensuring that only a single entry is selected.
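The per-entry exclude logic might be modeled as below. The source does not say how an entry decides it should be excluded; the reference-bit policy used here (an entry excludes itself while it has been used recently) is purely an assumed example, as are all the names.

```python
class TlbEntry:
    """One TLB entry with the self-exclusion logic described above."""

    def __init__(self, vpn, rpn):
        self.vpn = vpn
        self.rpn = rpn
        self.referenced = False   # set on each use, cleared periodically

    def exclude(self):
        """Exclude signal: True means 'not available for replacement'.

        Policy assumed here: a recently referenced entry excludes itself.
        """
        return self.referenced


def exclude_vector(entries):
    """Collect every entry's exclude signal, as seen by the additional
    selection logic that must then pick exactly one candidate."""
    return [entry.exclude() for entry in entries]
```

The selection logic downstream of these signals must guarantee that exactly one non-excluded entry is chosen; the sequential-search method discussed next is one way to do that.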
One of the most common methods for selecting one of the entries for replacement is a sequential search through the entries and selection of the first entry which is eligible for replacement. This method can be very slow if the entry selected for replacement is one of the last entries within the translation lookaside buffer.
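The sequential-search method can be sketched in a few lines; the function name and list-of-booleans representation of the exclude signals are illustrative assumptions. The scan stops at the first eligible entry, which also guarantees that only a single entry is selected, but in the worst case it must examine every entry in the buffer, which is the slowness noted above.

```python
def first_eligible(exclude):
    """Sequentially scan the per-entry exclude signals and return the
    index of the first entry eligible for replacement (exclude False),
    or None if every entry is excluded.

    Worst case: the only eligible entry is last, so the scan touches
    the entire buffer.
    """
    for index, excluded in enumerate(exclude):
        if not excluded:
            return index
    return None
```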
Therefore, there is a need in the art for an improved selection method to select an entry within a translation lookaside buffer for replacement. The present invention meets this and other needs.
This application is related to application Ser. No. 07/726,619 filed Jul. 8, 1991 of Jeffry E. Trull, entitled "Cache memory replacement selector", owned by the same entity, which is incorporated herein by reference for all that is disclosed and taught therein.