Caches are commonly used in computer systems. A cache is a collection of data duplicating original values stored elsewhere or computed earlier, where the original data is expensive to fetch (due to slow access time) or to compute, relative to the cost of reading the cache. In other words, a cache is a temporary storage area where frequently accessed data can be stored for rapid access. Once the data is stored in the cache, future accesses can be served from the cached copy rather than by re-fetching or recomputing the original data, so that the average access time is lower.
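The read-through pattern described above can be sketched minimally as follows; `expensive_fetch` is a hypothetical stand-in for a slow data source, not part of the original text:

```python
import time

class SimpleCache:
    """Minimal read-through cache: serve cached copies when present,
    otherwise fetch from the expensive source and store the result."""
    def __init__(self, fetch_fn):
        self._store = {}        # cached copies of previously fetched values
        self._fetch = fetch_fn  # expensive source of truth (hypothetical)

    def get(self, key):
        # Return the cached copy if present; otherwise fetch and cache it.
        if key not in self._store:
            self._store[key] = self._fetch(key)
        return self._store[key]

def expensive_fetch(key):
    time.sleep(0.01)  # stand-in for slow storage access or computation
    return key * 2

cache = SimpleCache(expensive_fetch)
print(cache.get(21))  # first access fetches and stores the value
print(cache.get(21))  # second access reads the cached copy
```

The second `get` never touches `expensive_fetch`, which is the source of the lower average access time.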
Caches have proven to be extremely effective in many areas of computing because access patterns in typical computer applications exhibit locality of reference (temporal and spatial locality). Temporal locality refers to data that are accessed close together in time: data accessed recently are likely to be accessed again soon. Spatial locality refers to data that are located physically close to each other: when one location is accessed, nearby locations are likely to be accessed soon after.
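A least-recently-used (LRU) eviction policy is one common way a bounded cache exploits temporal locality; the following is an illustrative sketch, not a policy named in the text:

```python
from collections import OrderedDict

class LRUCache:
    """Bounded cache exploiting temporal locality: recently used entries
    are retained, and the least recently used entry is evicted first."""
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()  # maintains access order

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" becomes most recently used
cache.put("c", 3)      # evicts "b", the least recently used entry
print(cache.get("b"))  # None: evicted
print(cache.get("a"))  # 1: still cached
```

The bet an LRU policy makes is exactly temporal locality: an entry used recently is worth keeping because it is likely to be used again.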
In a clustered cache structure, the cluster may also provide redundant storage for security and backup purposes by replicating data to all instances in the cache cluster. As the number of nodes in the cluster increases, a bottleneck may form when a processing component attempts to access the data stored in the cluster. Also, replicating data to all instances may impact memory usage and network traffic every time another instance is added to the cluster. As such, a need exists for efficiently providing data in a large cache structure.
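The scaling cost of full replication can be seen in a minimal sketch, where every logical write fans out to all instances (the class and method names are illustrative, not from the original text):

```python
class CacheNode:
    """One cache instance in the cluster."""
    def __init__(self):
        self.store = {}

class ReplicatedCluster:
    """Fully replicated cache cluster: every put is copied to all
    instances, so each added node adds one more write and one more
    in-memory copy per entry."""
    def __init__(self, num_nodes):
        self.nodes = [CacheNode() for _ in range(num_nodes)]

    def put(self, key, value):
        # One logical write becomes num_nodes physical writes.
        for node in self.nodes:
            node.store[key] = value

    def get(self, key):
        # Any node can serve the read, since all hold redundant copies.
        return self.nodes[0].store.get(key)

cluster = ReplicatedCluster(num_nodes=4)
cluster.put("user:1", "alice")
print(cluster.get("user:1"))                            # alice
print(all("user:1" in n.store for n in cluster.nodes))  # True
```

With N nodes, each write costs N network messages and the entry occupies memory on every instance, which is the memory and traffic impact noted above.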