Data caches are used in many environments to increase speed and reduce memory bandwidth requirements. In operation, accessing a data item from memory causes the data item to be stored in the cache. Subsequent accesses to the data item may be satisfied directly from the cache, avoiding more costly transfers from memory.
When the cache becomes full, cache locations must be cleared to make room for new data. A typical cache management scheme identifies the least recently accessed data items of the cache as candidates to be cleared. Thus, cache locations that have not been accessed recently are more likely to be cleared, while cache locations that have been accessed most recently are given preference for cache retention.
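The general policy above can be sketched as a simple least-recently-used cache. The class and method names here are illustrative, not taken from the source:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal sketch: least recently accessed items are cleared first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # preserves access order, oldest first

    def access(self, key, value=None):
        if key in self.data:
            # Recently accessed items are given preference for retention.
            self.data.move_to_end(key)
            return self.data[key]
        if len(self.data) >= self.capacity:
            # Clear the least recently accessed location to make room.
            self.data.popitem(last=False)
        self.data[key] = value
        return value
```

For example, with a capacity of two, accessing items a, b, a, then c clears b, since a was touched more recently than b.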
More specifically, this type of cache management policy may be implemented by a scheme referred to as the “Quad-Age” cache management algorithm. Each cached data item is associated with two bits indicating its “age.” The age of a data item may range from 0 to 3, with 0 indicating a least recently used item that is most vulnerable to eviction from the cache, and 3 indicating a most recently used item having high retention priority. The cache locations are then managed in accordance with three policies:

- Insertion Age Policy. When a data item is inserted into the cache, it is given an age of “1”.
- Hit Promotion Policy. When a data item is subsequently accessed from the cache (referred to as a “hit”), its age is promoted to “3”.
- Eviction Policy. To find a data item to be cleared, the cache is searched for the first data item having an age of “0”, and that item is cleared. If no such data item is found, the ages of all data items are decremented and the search is performed again. This decrementing and searching is repeated until a data item having an age of “0” is found.
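The three policies above can be sketched in code. This is a minimal illustrative implementation; the class name and data layout are assumptions, not part of the source:

```python
class QuadAgeCache:
    """Sketch of the "Quad-Age" policies: insert at age 1, promote
    hits to age 3, evict the first age-0 item (decrementing all ages
    until one is found)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = {}  # key -> (value, age); insertion order preserved

    def access(self, key, value=None):
        if key in self.entries:
            # Hit Promotion Policy: a hit promotes the item's age to 3.
            v, _ = self.entries[key]
            self.entries[key] = (v, 3)
            return v
        if len(self.entries) >= self.capacity:
            self._evict()
        # Insertion Age Policy: a newly inserted item is given age 1.
        self.entries[key] = (value, 1)
        return value

    def _evict(self):
        # Eviction Policy: search for the first item with age 0; if none
        # exists, decrement every age and search again, repeating until
        # an age-0 item is found and cleared.
        while True:
            for k, (_, age) in self.entries.items():
                if age == 0:
                    del self.entries[k]
                    return
            self.entries = {k: (v, age - 1)
                            for k, (v, age) in self.entries.items()}
```

For example, with a capacity of two: insert a and b (both age 1), hit a (promoted to age 3), then insert c. No item has age 0, so all ages are decremented (a to 2, b to 0), and b is cleared to make room for c.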