Memory caching is a technique used in computer systems to provide fast access to frequently used data or instructions. In a typical cached system, a fast auxiliary memory serves as the cache, under the control of a microprogram that moves instructions or data in and out of the cache memory as needed.
The general concept is that when a memory word is referenced, it is brought from the larger, slower main memory into the cache, so that the next time it is used it can be accessed quickly. The underlying premise of caching is that the memory references made during any short time period tend to use only a small part of a computer system's total memory. For example, the instructions of a program tend to exhibit a high degree of spatial and temporal proximity. Likewise, the data to be processed tend to be predictable in terms of which data will be accessed next.
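The premise above can be illustrated with a minimal sketch of a direct-mapped cache model (the structure, sizes, and names here are illustrative assumptions, not taken from the source): a sequential access pattern with high spatial proximity incurs only one miss per cache line, after which the remaining words of the line hit.

```python
# Illustrative sketch of a direct-mapped cache: each memory line maps to
# exactly one cache line, and a stored tag records which memory line is
# currently resident there.
class DirectMappedCache:
    def __init__(self, num_lines=64, words_per_line=8):
        self.num_lines = num_lines
        self.words_per_line = words_per_line
        self.tags = [None] * num_lines  # one tag per cache line
        self.hits = 0
        self.misses = 0

    def access(self, address):
        line_addr = address // self.words_per_line  # memory line containing the word
        index = line_addr % self.num_lines          # cache line it maps to
        tag = line_addr // self.num_lines           # remaining address bits
        if self.tags[index] == tag:
            self.hits += 1
        else:
            self.misses += 1
            self.tags[index] = tag  # fetch the line from slower memory

cache = DirectMappedCache()
# Sequential references (high spatial proximity): one miss brings in an
# 8-word line, and the next 7 references to that line are hits.
for addr in range(4096):
    cache.access(addr)
print(cache.hits, cache.misses)  # 3584 hits, 512 misses
```

With 4096 sequential word references and 8 words per line, only 512 line fetches touch the slower memory; the other 3584 references are served from the cache, which is the payoff the locality premise predicts.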
For general computing applications, guidelines and techniques for efficient cache design have been developed, based on the spatial and temporal proximity characteristics of typical computer programs. For some specialized computing applications, however, these guidelines may no longer apply.
One type of computing application that exhibits unique proximity characteristics is image processing. Image data are predominantly spatially oriented, and the usual temporal-reuse assumptions do not apply. Other signal processing applications, in addition to image processing, operate on large amounts of data in a manner that is not characteristic of conventional programming. These data operations do, however, tend to be predictable with respect to the spatial organization of the data to be processed.
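The spatial predictability of image data can be sketched as follows (the image width, pixel position, and function name are assumptions for illustration only): for a row-major image, the addresses touched by a 3x3 neighborhood operation are fully determined by the pixel position and the image width, with no reliance on temporal reuse.

```python
# Illustrative sketch: addresses touched by a 3x3 neighborhood operation
# on a row-major image. The pattern is three short runs of consecutive
# words, with the runs a fixed stride (the image width) apart.
WIDTH = 640  # assumed image width in pixels

def neighborhood_addresses(row, col, width=WIDTH):
    """Word addresses of the 3x3 neighborhood around (row, col)
    in a row-major image."""
    return [(row + dr) * width + (col + dc)
            for dr in (-1, 0, 1)
            for dc in (-1, 0, 1)]

addrs = neighborhood_addresses(100, 200)
print(addrs)
# Each row of the neighborhood is 3 consecutive addresses, and the rows
# are exactly WIDTH words apart -- a fixed-stride pattern a cache
# designed for image data could exploit.
```

The point of the sketch is that although a given pixel may never be referenced again (weak temporal proximity), the location of the next data to be processed is known in advance from the spatial layout alone.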
Because of these differences in data handling, the cache design guidelines accepted for conventional computer programming are unsuitable for signal processing applications. A need exists for an improved cache design for signal processing applications.