Data may be temporarily stored in a cache to speed up future requests for frequently accessed data, as opposed to accessing a slower or more remote device where a complete copy of the data is stored. For example, the data stored in the cache may include values that have been previously computed. In response to receiving a request for data stored in the cache, the cache may be read to access the requested data. When the requested data is not stored in the cache, the data may be recomputed or retrieved from an original storage location, which may involve additional retrieval time compared to accessing data stored in the cache. Therefore, overall system performance may increase when data requests can be served by accessing the cache.
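One way to realize "values that have been previously computed" is memoization. The sketch below uses Python's `functools.lru_cache` purely as an illustration; the function name `expensive` is hypothetical and stands in for any slow computation or remote fetch.

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def expensive(n):
    # Stands in for a slow computation or a fetch from a remote device.
    return n * n

expensive(4)   # first request: computed and stored in the cache
expensive(4)   # repeat request: served from the cache without recomputing
```

The second call avoids the recomputation entirely, which is the performance benefit described above.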
The cache may include a pool of cache entries, each holding a piece of data. In addition, a copy of each cache entry may be stored in the original storage location. When a cache client requests access to a cache entry, the cache may determine whether the requested entry is present (i.e., a cache hit). For example, a web application may check its cache to determine whether the cache includes data requested by a web application user. In response to a cache hit, the cache may provide the cache entry to the client.
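The hit-path check described above can be sketched as follows; this is a minimal illustration, and the names `cache`, `origin`, and `lookup` are illustrative rather than taken from the original text.

```python
# Pool of cache entries, each holding a piece of data.
cache = {"page:home": "<html>home</html>"}

# Original storage location, holding a complete copy of the data.
origin = {"page:home": "<html>home</html>",
          "page:about": "<html>about</html>"}

def lookup(key):
    if key in cache:        # cache hit: provide the entry to the client
        return cache[key]
    return None             # cache miss: handled by the replacement logic
```

A request for `"page:home"` is served directly from the cache, while `"page:about"` falls through to the miss path.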
When the cache determines that the requested entry is not stored in the cache (i.e., a cache miss), the requested entry may be loaded from the original storage location. In addition, the previously uncached entry may be copied into the cache for future requests involving that entry. After a cache miss, the system may remove an existing cache entry to make room for the new entry (i.e., a replacement policy). The replacement policy may remove the existing cache entry according to a least recently used (LRU) method or a least frequently used (LFU) method.
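The miss path and an LRU replacement policy can be sketched together. The class below is a hypothetical, minimal implementation assuming a fixed-capacity cache backed by a slower store; it uses `collections.OrderedDict` to track recency, and all names are illustrative.

```python
from collections import OrderedDict

class LRUCache:
    """Sketch of a cache with a least recently used (LRU) replacement policy."""

    def __init__(self, capacity, backing_store):
        self.capacity = capacity
        self.store = backing_store    # the original storage location
        self.entries = OrderedDict()  # keys ordered least- to most-recently used

    def get(self, key):
        if key in self.entries:                 # cache hit
            self.entries.move_to_end(key)       # mark as most recently used
            return self.entries[key]
        value = self.store[key]                 # cache miss: load from origin
        if len(self.entries) >= self.capacity:
            self.entries.popitem(last=False)    # evict least recently used entry
        self.entries[key] = value               # copy entry in for future requests
        return value
```

For example, with a capacity of two, requesting `a`, `b`, `a`, then `c` evicts `b`, since `a` was used more recently at the time of the miss on `c`.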