The present invention relates generally to caching data, and more particularly to caching data on a hard disk drive (HDD) for better read performance.
Conventionally, a cache is a portion of memory (i.e., random access memory (RAM)) used to temporarily store frequently accessed data. Caching is the process of copying frequently accessed data into memory when the data is expensive (time wise) to fetch from its original location.
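The caching process described above can be sketched as follows. This is a minimal illustration only; all function and variable names are assumptions for the example and are not drawn from the present specification.

```python
# Minimal sketch of a cache in front of a slow backing store.
# slow_fetch() stands in for an expensive read (e.g., from a hard
# disk drive); the dict plays the role of the in-memory cache.

def slow_fetch(key):
    """Stand-in for an expensive (time wise) read from the original location."""
    return key.upper()  # placeholder computation

cache = {}

def cached_fetch(key):
    # Serve the request from fast memory when the data is already cached...
    if key in cache:
        return cache[key]
    # ...otherwise pay the expensive fetch once and remember the result.
    value = slow_fetch(key)
    cache[key] = value
    return value
```

On a second request for the same key, `cached_fetch` returns the copy held in memory and avoids the expensive fetch entirely, which is the benefit the caching process provides.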
As software applications grow ever larger, so too does the amount of data required to run them. Additionally, modern computers are able to multitask a significant number of applications, further reducing the memory available to store frequently accessed data.
Memory being a finite resource, it is inevitable that an application will request data not residing in the cache, requiring a trip to the hard disk drive (HDD) to retrieve that data. Hard disk drives, being orders of magnitude slower than memory, account for a large portion of the expense (time wise) involved in computing.