A cache is memory that temporarily stores frequently accessed data. Once data is stored in the cache, subsequent requests retrieve it from the cache rather than recomputing it or fetching it from a slower memory location. Caching thus lowers average access times.
Distributed storage systems use local caches to store remotely retrieved data. When data is requested, the system first determines whether the data resides in a cache local to the requesting application. If it does not, the data is retrieved from a central server or remote database. Accessing data from these locations is relatively slow, especially when the data resides on a disk array or in a different geographic location.
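The lookup flow described above is often called the cache-aside pattern. The sketch below illustrates it with a local dictionary as the cache; `remote_store` and `fetch_from_remote` are illustrative stand-ins for a central server or remote database, not a real API:

```python
# Local cache for data retrieved from a remote location.
local_cache = {}

# Stand-in for a central server or remote database.
remote_store = {"user:1": "alice", "user:2": "bob"}

def fetch_from_remote(key):
    """Simulates a slow round trip to the remote store."""
    return remote_store[key]

def get(key):
    """Check the local cache first; fall back to the remote store on a miss."""
    if key in local_cache:            # local hit: fast path
        return local_cache[key]
    value = fetch_from_remote(key)    # miss: slow remote fetch
    local_cache[key] = value          # populate the cache for next time
    return value
```

Only the first request for a key incurs the remote round trip; subsequent requests for the same key are served from the local cache.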
Storage systems can therefore benefit from new caching and fetching techniques that reduce data access times.