1. Field of the Invention:
This invention relates to computing systems and to handling sequential read streams in the computing systems. More particularly, the invention relates to optimizing system performance during disk sequential read streams while minimizing the impact of such optimization on cache operations and other non-sequential read applications of the computing system.
2. Description of the Related Art:
In a computing system having cache memory and large volume storage devices, such as disk drives and tape drives, it is desirable to transfer information from a large volume storage device to cache memory. Relative to the speed of the computer processor, the time to access a record in a large volume storage device is very slow, while the time to access a record in cache memory is quite fast. Where the application program being run by the computing system is using sequential records, the performance of the system is enhanced by prefetching records from a large volume storage device, such as a disk drive, and loading these records into cache memory just prior to a request for the records from the processor. Then, when the read record request is received from the processor, the record is rapidly read from cache.
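The read-ahead behavior described above may be sketched as follows. This is a minimal illustrative model, not an implementation from the invention; the `Disk`, `Cache`, and `read_with_prefetch` names, the block-number addressing, and the prefetch depth of four blocks are all assumptions introduced for illustration.

```python
class Disk:
    """Stand-in for a large volume storage device: slow block reads."""
    def read(self, block):
        return f"data-{block}"          # placeholder for a slow disk access

class Cache:
    """Stand-in for cache memory: fast lookup of already-staged blocks."""
    def __init__(self):
        self.blocks = {}
    def put(self, block, data):
        self.blocks[block] = data
    def get(self, block):
        return self.blocks.get(block)   # None on a cache miss

def read_with_prefetch(disk, cache, block, prefetch_count=4):
    """Serve `block`, then stage the next `prefetch_count` blocks in cache
    so a sequential reader finds them already cached on later requests."""
    data = cache.get(block)
    if data is None:                    # miss: go to the slow device
        data = disk.read(block)
        cache.put(block, data)
    for b in range(block + 1, block + 1 + prefetch_count):
        if cache.get(b) is None:        # read ahead of the requester
            cache.put(b, disk.read(b))
    return data
```

After a call such as `read_with_prefetch(disk, cache, 10)`, blocks 11 through 14 are already staged, so a sequential reader's next requests are satisfied from cache rather than from the slow device.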
The prefetching of records from a large volume storage device is known to have three problems. The first problem is determining under what conditions the system should perform a prefetch. Since prefetching is most effective when reading sequential records, the first problem is really how to determine that the system is reading sequential records. The second problem is determining the size of the record data block to be prefetched. Prefetching data from the disk drive loads down the disk drive relative to access to the drive by other applications. Therefore, the time spent in prefetching should be as small as possible; in other words, the question is how small the number of prefetched data blocks can be while still accomplishing the prefetch goals. The third problem is how long prefetched data should remain in cache. If the cache becomes loaded with large volumes of prefetched sequential records, then random access records for other applications are squeezed out of cache memory.
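The first of the three problems, detecting that a stream of requests is sequential, can be illustrated with a simple run-length heuristic. This is only a sketch of one possible detection scheme, not the method of the invention; the `SequentialDetector` name, the adjacency test, and the threshold of two consecutive adjacent requests are assumptions made for illustration.

```python
class SequentialDetector:
    """Declare a read stream sequential once `threshold` consecutive
    requests have arrived for adjacent block numbers."""
    def __init__(self, threshold=2):
        self.threshold = threshold
        self.last_block = None
        self.run_length = 0

    def observe(self, block):
        """Record one read request; return True when the stream now
        looks sequential and a prefetch would be justified."""
        if self.last_block is not None and block == self.last_block + 1:
            self.run_length += 1        # request extends the current run
        else:
            self.run_length = 0         # run broken: treat as random access
        self.last_block = block
        return self.run_length >= self.threshold
```

Under this heuristic, isolated or random-access requests never trigger a prefetch, while a run of adjacent requests quickly does; tuning the threshold trades missed sequential streams against prefetches wasted on short runs.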