A memory cache is an integral computing component that allows data objects to be transparently stored in locations that are more quickly accessible to processing systems. These data objects may include files, documents, pictures, videos, images, or other similar data objects, including combinations and portions thereof. In operation, when a process requires a data object, the data object may be retrieved from a storage system, such as a hard disk or solid-state drive, and stored in the memory cache to be more quickly accessible to the processing system. In addition to the requested portion of the data object, subsequent portions may also be stored in the memory cache as a prediction of future calls to the storage system. Thus, rather than making a plurality of calls, data objects that are likely to be retrieved in succession may be stored in the memory cache using fewer read operations.
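The read-ahead behavior described above can be sketched as follows. This is a minimal illustrative model, not an implementation from the original text; the class name `ReadAheadCache`, the `read_ahead` and `capacity` parameters, and the block-addressed storage are all assumptions introduced for the example. On a cache miss, a single storage read fetches the requested block together with a predicted run of subsequent blocks.

```python
from collections import OrderedDict

class ReadAheadCache:
    """Illustrative sketch of a memory cache with read-ahead.

    On a miss, one storage read fetches the requested block plus the
    next `read_ahead` blocks, anticipating sequential access. Names
    and structure are hypothetical, chosen only for this example.
    """

    def __init__(self, storage, capacity, read_ahead=3):
        self.storage = storage        # block-addressed backing store
        self.capacity = capacity      # maximum number of cached blocks
        self.read_ahead = read_ahead  # predicted subsequent blocks
        self.cache = OrderedDict()    # block id -> data, in LRU order
        self.storage_reads = 0        # calls made to the backing store

    def get(self, block):
        if block in self.cache:
            self.cache.move_to_end(block)  # mark as recently used
            return self.cache[block]
        # One storage read covers the requested and the predicted blocks.
        self.storage_reads += 1
        end = min(block + 1 + self.read_ahead, len(self.storage))
        for b in range(block, end):
            self._insert(b, self.storage[b])
        return self.cache[block]

    def _insert(self, block, data):
        self.cache[block] = data
        self.cache.move_to_end(block)
        while len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
```

With `read_ahead=3`, reading blocks 0 through 7 in sequence costs only two storage reads instead of eight, since each miss also stages the three blocks predicted to be requested next.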
Although predicting future memory calls may be beneficial when a single process is executing, it may be inefficient when multiple processes require data objects for processing. For example, a first process may retrieve a particular data segment that corresponds to the requested and predicted data for that process. At or about the same time, one or more other processes may also attempt to store data segments in the memory cache to provide quicker access to those data segments. Thus, if each request requires a large portion of the memory cache, data within the cache may be consistently overwritten to provide the necessary data to each of the processes. This consistent overwriting, often referred to as thrashing, is inefficient and may slow each of the processes executing on the processing system.
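The thrashing scenario above can be made concrete with a small model. Everything here is an illustrative assumption, not the original's design: a shared LRU cache of fixed size, round-robin requests from several sequential streams, and a miss policy that prefetches a run of following blocks. When two streams each prefetch enough blocks to fill most of the cache, each one evicts the other's data before it can be reused, and every request becomes a storage read.

```python
from collections import OrderedDict

def storage_reads(cache_size, streams, prefetch):
    """Count backing-store reads for round-robin block requests from
    several processes sharing one LRU cache. Each miss also prefetches
    the next `prefetch` blocks. A hypothetical model for illustration.
    """
    cache = OrderedDict()
    reads = 0

    def touch(b):
        cache[b] = True
        cache.move_to_end(b)
        while len(cache) > cache_size:
            cache.popitem(last=False)  # evict least recently used

    # Each step, every process requests its next sequential block.
    for step in zip(*streams):
        for block in step:
            if block in cache:
                cache.move_to_end(block)  # hit: no storage read
            else:
                reads += 1  # miss: one read stages the predicted run
                for b in range(block, block + 1 + prefetch):
                    touch(b)
    return reads

# One process alone: 16 sequential blocks, prefetch run of 8, cache of 8.
solo = storage_reads(8, [range(16)], prefetch=7)
# Two processes in disjoint regions sharing that same 8-block cache:
shared = storage_reads(8, [range(16), range(100, 116)], prefetch=7)
```

In the solo case the prefetched run survives until it is consumed, so 16 requests cost 2 reads. In the shared case each process's prefetch overwrites the other's cached blocks before they are reused, so all 32 requests miss and cost 32 reads, which is the thrashing behavior described above.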