1. Technical Field
The present invention is directed to an improved data processing system and, in particular, to an improved data cache array for utilization in a data processing system. Still more particularly, the present invention relates to an improved method and system for concurrent access in a data cache array utilizing multiple match line selection paths.
2. Description of the Related Art
Many systems for processing information include both a system memory and a cache memory. A cache memory is a relatively small, high-speed memory that stores a copy of information from one or more portions of the system memory. Frequently, the cache memory is physically distinct from the system memory. Such a cache memory can be integral with the processor device of the system or non-integral with the processor.
Information may be copied from a portion of the system memory into the cache memory. The information in the cache memory may then be modified. Further, modified information from the cache memory can then be copied back to a portion of the system memory. Accordingly, it is important to map information in the cache memory relative to its location within system memory.
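The copy-in, modify, and copy-back cycle described above, together with the need to map each cached line back to its origin in system memory, can be illustrated with a small model. The sketch below assumes a direct-mapped, write-back organization with per-line tag and dirty bits; the patent does not specify this particular organization, and all names are hypothetical.

```python
# Hypothetical model of a direct-mapped, write-back cache.
# Tags record where each copy came from; dirty bits mark modified
# lines that must be copied back to system memory on eviction.

CACHE_LINES = 4  # number of cache lines (power of two); one word per line

class Cache:
    def __init__(self, memory):
        self.memory = memory                    # backing system memory (list of words)
        self.valid = [False] * CACHE_LINES
        self.dirty = [False] * CACHE_LINES      # modified since it was copied in?
        self.tags = [0] * CACHE_LINES           # maps each line to a memory address
        self.data = [0] * CACHE_LINES

    def _line(self, addr):
        return addr % CACHE_LINES               # index bits of the address

    def _fill(self, addr):
        """Copy a word from system memory into the cache, first writing
        back any modified line that currently occupies the slot."""
        i = self._line(addr)
        if self.valid[i] and self.dirty[i]:
            self.memory[self.tags[i]] = self.data[i]  # copy modified data back
        self.valid[i], self.dirty[i] = True, False
        self.tags[i] = addr                     # remember the origin of the copy
        self.data[i] = self.memory[addr]
        return i

    def read(self, addr):
        i = self._line(addr)
        if not (self.valid[i] and self.tags[i] == addr):
            i = self._fill(addr)                # miss: copy in from system memory
        return self.data[i]

    def write(self, addr, value):
        i = self._line(addr)
        if not (self.valid[i] and self.tags[i] == addr):
            i = self._fill(addr)
        self.data[i] = value
        self.dirty[i] = True                    # modified; copy back before reuse
```

In this model, a write to a cached word only marks the line dirty; the modified data reaches system memory when the line is evicted, which is why the tag mapping back to the original address is essential.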
Assuming selection of an appropriately sized cache memory and the efficient storage of data therein, the limiting factor in cache performance is the speed of the cache memory and the ability of the system to rapidly write data into or read data from the cache memory. This access speed is necessarily limited by the width of the access bus and the speed of the memory.
Division of the cache into multiple subarrays may provide some increase in access speed by increasing the effective bandwidth of the access path; however, this approach entails increased addressing complexity and the overhead of maintaining multiple separate subarrays.
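Both the bandwidth gain and the added addressing complexity of subarray division stem from how the address is decoded. The sketch below is a hypothetical illustration (not taken from the patent) in which the low-order address bits select a subarray and the remaining bits index within it, so that accesses to distinct subarrays can proceed concurrently.

```python
# Hypothetical address decomposition for a cache divided into subarrays.
# Interleaving on the low-order bits spreads consecutive addresses across
# subarrays, so consecutive accesses can often proceed in parallel.

NUM_SUBARRAYS = 4                                 # must be a power of two
SUBARRAY_BITS = NUM_SUBARRAYS.bit_length() - 1    # = 2 select bits

def decode(addr):
    """Split an address into (subarray select, within-subarray index)."""
    subarray = addr & (NUM_SUBARRAYS - 1)         # low bits pick the subarray
    index = addr >> SUBARRAY_BITS                 # remaining bits index into it
    return subarray, index

def can_proceed_concurrently(addrs):
    """A set of accesses can proceed in parallel only if no two of them
    target the same subarray (a bank conflict serializes them)."""
    subarrays = [decode(a)[0] for a in addrs]
    return len(set(subarrays)) == len(subarrays)
```

The extra decoding step and the conflict check are precisely the addressing complexity and bookkeeping overhead that the passage above identifies as the cost of this approach.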
In view of the above, it should be apparent that a method and system for increasing the speed of access to a data cache array would be highly desirable.