The background description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent the work is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
Generally, data requested by a processor is initially read from a storage device (e.g., a disk drive) and is stored in a system memory (also called main memory, typically DRAM) used by the processor. The processor then reads the data from the system memory via a system bus to which the processor, the system memory, and other peripherals are connected, and processes the data. Because every access must traverse the shared system bus and the comparatively slow system memory, processing the data in this manner can be slow.
Instead, data frequently used by the processor can be stored in cache memories. This process is called caching, and data stored in cache memories is called cached data. The processor can access and process cached data faster than the data stored in the disk drive or the system memory. Accordingly, cache memories can improve system performance and throughput.
Specifically, a cache controller stores the data frequently used by the processor in a cache memory. The cache memory is generally faster than the system memory and may be coupled to the processor by a bus separate from the system bus. Accordingly, the processor can access the data in the cache memory faster than the data stored in the system memory. Caching the data therefore improves system performance and throughput.
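The caching principle described above can be illustrated with a minimal sketch. The code below is not the cache controller of the present disclosure; it is a hypothetical software analogue, assuming a least-recently-used (LRU) eviction policy, in which a small fast store (`cache`) sits in front of a larger slow store (`backing_store`, standing in for system memory or a disk drive). Frequently used entries are served from the fast store, mirroring how a cache controller keeps frequently used data close to the processor.

```python
from collections import OrderedDict

class SimpleCache:
    """Hypothetical sketch of the caching principle: keep frequently
    used data in a small fast store, fall back to a slow backing store
    on a miss, and evict the least recently used entry when full."""

    def __init__(self, capacity, backing_store):
        self.capacity = capacity
        self.backing_store = backing_store  # stands in for system memory / disk
        self.cache = OrderedDict()          # fast store; insertion order tracks recency
        self.hits = 0
        self.misses = 0

    def read(self, address):
        if address in self.cache:
            # Fast path: data is cached; mark it most recently used.
            self.hits += 1
            self.cache.move_to_end(address)
            return self.cache[address]
        # Slow path: fetch from the backing store and cache the result.
        self.misses += 1
        value = self.backing_store[address]
        self.cache[address] = value
        if len(self.cache) > self.capacity:
            # Evict the least recently used entry (front of the OrderedDict).
            self.cache.popitem(last=False)
        return value
```

Repeated reads of the same address hit the fast store after the first miss, which is the behavior that makes caching improve performance and throughput when access patterns exhibit locality.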