1. Field
The description relates to cache memory systems capable of reducing cache miss probability, and to methods of operating such cache memory systems.
2. Description of Related Art
A processor, such as a central processing unit (CPU) or a graphics processing unit (GPU), retrieves commands or data from a large-capacity external memory for processing. The processing speed of most large-capacity external memories is very low compared to that of a processor, and thus, a cache memory system is used to improve the effective memory access speed.
A cache memory system stores data that a processor has recently accessed. If the processor requests the same data again, the cache memory system allows the processor to access the high-speed cache memory instead of the external memory, thereby improving the data transmission speed.
If data requested by the processor is stored in a data memory of the cache memory system (a cache hit), the data in the data memory is transmitted to the processor; if the requested data is not present (a cache miss), the data is read from the external memory. The cache memory system then evicts one piece of cache data stored in the data memory, replaces it with the read data, and transmits the read data to the processor.
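The hit/miss/replacement flow described above can be sketched as follows. This is a minimal illustrative model, not any particular hardware design: the external memory is modeled as a dictionary, and the evicted line is chosen arbitrarily, since no specific replacement policy has been introduced yet.

```python
class SimpleCache:
    """Minimal sketch of the cache hit/miss flow (illustrative only)."""

    def __init__(self, capacity, external_memory):
        self.capacity = capacity              # number of cache lines held
        self.data = {}                        # address -> cached data (the "data memory")
        self.external_memory = external_memory

    def read(self, address):
        if address in self.data:              # cache hit: serve from fast cache memory
            return self.data[address]
        value = self.external_memory[address] # cache miss: read from external memory
        if len(self.data) >= self.capacity:   # data memory full: evict one piece of
            self.data.pop(next(iter(self.data)))  # cache data (arbitrary choice here)
        self.data[address] = value            # replace it with the read data
        return value                          # read data is transmitted to the processor
```

For example, with a one-line cache, a second read of the same address is a hit, while a read of a different address evicts the resident line.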
The cache memory system may be implemented, according to the mapping method used, as a set-associative cache memory, which uses a set-associative mapping method, or as a direct-mapped cache memory, which uses a direct mapping method. A set-associative cache memory includes a plurality of ways, and in the case of a cache miss, cache data corresponding to a predetermined way is replaced with new data read from an external memory according to a replacement policy.
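A set-associative cache of the kind described above can be sketched as follows. The set indexing by address modulo, the set/way counts, and the use of least-recently-used (LRU) as the replacement policy are all illustrative assumptions; the source names no specific policy.

```python
from collections import OrderedDict

class SetAssociativeCache:
    """Hedged sketch of a set-associative cache with an LRU replacement policy."""

    def __init__(self, num_sets, num_ways, external_memory):
        self.num_sets = num_sets
        self.num_ways = num_ways
        # one ordered map per set: tag -> data, kept in LRU order
        self.sets = [OrderedDict() for _ in range(num_sets)]
        self.external_memory = external_memory

    def read(self, address):
        index = address % self.num_sets        # set selected by the address (assumed scheme)
        tag = address // self.num_sets         # tag distinguishes lines within a set
        ways = self.sets[index]
        if tag in ways:                        # cache hit in one of the ways
            ways.move_to_end(tag)              # mark as most recently used
            return ways[tag]
        value = self.external_memory[address]  # cache miss: read from external memory
        if len(ways) >= self.num_ways:         # all ways occupied: apply replacement policy
            ways.popitem(last=False)           # evict the least-recently-used way
        ways[tag] = value                      # place new data in the freed way
        return value
```

With one set and two ways, reading three distinct addresses evicts the least recently used of the first two, which illustrates how the replacement policy selects the way to replace on a miss.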