1. Technical Field
The present disclosure relates to the field of computers, and specifically to memory storage. Still more particularly, the present disclosure relates to managing cache data.
2. Description of the Related Art
Cache memory ("cache") is defined as a local high-speed memory, typically located within or near a core of a processor. Because the amount of room within a cache is limited, a cache/memory controller stores in cache only the instructions/data that are needed by the processor. Thus, caches often replace ("evict") less frequently used blocks of data (hereafter a "cache line" or simply "line") to make room for more recently used data. In most systems the evicted line is either written back to memory (if it has been modified) or simply overwritten by the newer line (if it has not been modified). Either action is known as "evicting" the line. After a line has been evicted, it may again be needed by the processor and must then be brought back into the processor's cache from memory.
Similarly, caches may share cache lines among themselves. This sharing can be made mandatory, in which one cache imposes ("injects") a cache line on another cache. The injection process, however, raises several problems, particularly in deciding which cache should be the candidate to receive an evicted cache line.