The present invention relates to a storage system having store data buffering means. More particularly, it relates to a storage system well suited to a data processing system which includes a cache for retaining copies of some of the data retained in a main storage and which is controlled by a pipeline system wherein "fetch" and "store" operations are simultaneously generated for the cache.
Storages (a main storage and a cache) in a data processing system are controlled to perform the respective operations of fetching data, namely, an instruction and operands, and of storing an operand obtained as the result of instruction execution. Further, since the cache holds copies of data in the main storage, the "store" operation of writing data into the buffer storage entails a so-called block transfer, in which a block composed of a certain number of bytes is fetched from the main storage and registered in the buffer storage. In the cache of a data processing system of the type capable of pipeline operation, the operations of instruction fetch, operand fetch, store and block transfer can occur at the same time. Requests for these operations are assigned priority levels, and when they occur, the buffer storage is accessed in the order of (1) block transfer, (2) store, (3) operand fetch and (4) instruction fetch. When such requests for access to the cache are in contention, the request of lower priority is deferred, which slows the instruction processing.
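The fixed-priority arbitration described above can be sketched as follows. This is an illustrative model only, not a disclosure of the system's actual circuitry; the names `PRIORITY` and `arbitrate` are assumptions introduced for the sketch.

```python
# Illustrative sketch (assumed model) of fixed-priority arbitration
# among simultaneous requests for access to the buffer storage (cache).
# Highest priority first, per the order given in the text.
PRIORITY = ["block_transfer", "store", "operand_fetch", "instruction_fetch"]

def arbitrate(pending):
    """Return the request type granted this cycle; all others are deferred.

    `pending` is the set of request types raised in the same cycle.
    Returns None when no request is pending.
    """
    for kind in PRIORITY:
        if kind in pending:
            return kind
    return None

# Example: a store and an operand fetch contend in the same cycle;
# the store is granted and the operand fetch is deferred, delaying
# the instruction that issued it.
granted = arbitrate({"store", "operand_fetch"})
```

The deferral of the lower-priority request in each cycle of contention is what degrades pipeline throughput, motivating the improvements discussed next.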
In order to lessen the contention of access requests for the cache, an improved system has also been proposed in which the first half of one machine cycle is allotted to storing data into the cache, and the latter half is allotted to fetching data from the cache, so that contention between the store operation and the operand or instruction fetch operation does not arise. This system, however, has the problem that the access time allowed for the cache becomes only half a machine cycle, so that high-speed cache components are required.
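The time-division scheme above can be sketched as follows. The cycle time and the names `CYCLE_NS` and `schedule` are hypothetical, chosen only to make the half-cycle constraint concrete.

```python
# Illustrative sketch (assumed model) of the half-cycle time-division
# scheme: within one machine cycle, a store uses the first half and a
# fetch uses the second half, so the two never contend for the cache.
CYCLE_NS = 10.0  # hypothetical machine-cycle time in nanoseconds

def schedule(requests):
    """Map each pending request to its time slot within one machine cycle.

    Returns a dict of request -> (slot_start, slot_end) in ns. Each cache
    access must complete within CYCLE_NS / 2, which is why this scheme
    demands faster cache components than a full-cycle access would.
    """
    half = CYCLE_NS / 2
    slots = {}
    if "store" in requests:
        slots["store"] = (0.0, half)
    if "fetch" in requests:
        slots["fetch"] = (half, CYCLE_NS)
    return slots
```

Note that even when a store and a fetch arrive together, both are serviced in the same cycle; the cost is that the cache's access time budget is halved.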
Japanese Patent Application Publication No. 53-24260 discloses a data processing system which avoids contention among requests for access to a cache. It does not, however, disclose a technique for reducing the contention between a store request and a fetch request issued by a data processing unit.