A cache memory device is a small, fast memory that holds the most frequently accessed data from a larger, slower memory. Random access memory (RAM) provides large storage capacity at relatively low cost; unfortunately, access to RAM is slow relative to the processing speed of modern microprocessors. Although the storage capacity of the cache memory may be relatively small, it provides high-speed access to the data stored in it.
The cache is managed, in various ways, so that it stores the instruction, translation, or data most likely to be needed at a given time. When the cache is accessed and contains the requested data, a cache “hit” occurs; when it does not, a cache “miss” occurs. The cache contents are therefore typically managed in an attempt to maximize the cache hit-to-miss ratio.
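The hit/miss behavior described above can be illustrated with a toy model. The following sketch is a hypothetical direct-mapped cache simulator (the class name, set count, and address stream are invented for illustration); real hardware caches also track valid and dirty bits, associativity, and replacement state.

```python
class DirectMappedCache:
    """Toy direct-mapped cache: one tag per set, with hit/miss counters."""

    def __init__(self, num_sets):
        self.num_sets = num_sets
        self.tags = [None] * num_sets  # one cached block tag per set
        self.hits = 0
        self.misses = 0

    def access(self, address):
        index = address % self.num_sets   # which set the address maps to
        tag = address // self.num_sets    # identifies the block within that set
        if self.tags[index] == tag:
            self.hits += 1                # hit: requested data already cached
            return True
        self.misses += 1                  # miss: fetch block, replace occupant
        self.tags[index] = tag
        return False

    def hit_ratio(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0


cache = DirectMappedCache(num_sets=4)
for addr in [0, 1, 0, 1, 2, 0]:  # repeated addresses hit after the first fetch
    cache.access(addr)
print(cache.hits, cache.misses)  # prints "3 3"
```

The first access to each address misses and fills the cache; subsequent accesses to the same address hit, which is why managing the contents to keep frequently reused data resident raises the hit-to-miss ratio.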
As part of routine maintenance, devices and operating systems flush pages of data out of caches and entries out of translation lookaside buffers (TLBs).
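The effect of such a flush can be sketched with a minimal write-back model. This is a hypothetical illustration (the class name and the dict standing in for RAM are assumptions, not an actual OS interface): a flush writes dirty entries back to the backing memory and then invalidates the cached copies.

```python
class WriteBackCache:
    """Toy write-back cache over a backing store (a dict standing in for RAM)."""

    def __init__(self, backing):
        self.backing = backing
        self.lines = {}  # address -> (value, dirty flag)

    def write(self, address, value):
        self.lines[address] = (value, True)   # dirty until written back

    def read(self, address):
        if address in self.lines:
            return self.lines[address][0]     # hit: serve from cache
        value = self.backing[address]         # miss: fill from backing store
        self.lines[address] = (value, False)
        return value

    def flush(self):
        # Write every dirty line back to the backing store,
        # then invalidate all cached entries.
        for address, (value, dirty) in self.lines.items():
            if dirty:
                self.backing[address] = value
        self.lines.clear()


ram = {0: 10, 1: 20}
cache = WriteBackCache(ram)
cache.write(0, 99)
stale = ram[0]   # RAM still holds 10: the write is only in the cache
cache.flush()
fresh = ram[0]   # after the flush, RAM holds 99 and the cache is empty
```

Until the flush, the backing memory holds stale data, which is why such maintenance operations matter for correctness when other agents (devices, other processors) read memory directly.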