Abstract

Replacement policy plays a major role in improving the performance of modern highly associative cache memories. As the demand of data-intensive applications increases, the size of the Last Level Cache (LLC) must also increase. Increasing the size of the LLC also increases the associativity of the cache. Modern LLCs are divided into multiple banks, where each bank is a set-associative cache. The replacement policy implemented on such highly associative banks incurs significant hardware (storage and area) overhead. In addition, Least Recently Used (LRU) based replacement policies suffer from dead blocks. A block in the cache is called dead if it is not used again before its eviction from the cache. Under the LRU policy, a dead block cannot be removed early; it must first age into the LRU position. We have therefore proposed a replacement technique that is capable of evicting dead blocks early while reducing hardware cost by 77% to 91% in comparison to the baseline techniques. In this policy, random replacement is used for 70% of the ways and LRU is applied to the remaining ways. The early eviction of dead blocks also improves system performance by 5%.
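To see where savings of this magnitude can come from, consider a counter-based LRU implementation that keeps one age counter of ceil(log2 W) bits per way in a W-way set. Restricting LRU to roughly 30% of the ways shrinks this state accordingly. The sketch below is a back-of-the-envelope estimate with assumed parameters (16 ways, counter-based LRU), not the paper's exact cost accounting:

```python
import math

def lru_state_bits(ways):
    # Counter-based LRU: one age counter of ceil(log2(ways)) bits per way.
    return ways * math.ceil(math.log2(ways))

full = lru_state_bits(16)               # LRU over all 16 ways: 16 * 4 = 64 bits/set
small = lru_state_bits(round(16 * 0.3)) # LRU over ~30% of ways (5): 5 * 3 = 15 bits/set
savings = 1 - small / full              # ~77% reduction in LRU state
print(f"full={full} bits, small={small} bits, savings={savings:.0%}")
```

With these assumed parameters the reduction already lands near the lower end of the 77%–91% range quoted above; larger associativities push the saving higher.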

Highlights

  • Replacement policy plays the most significant role in the performance of highly set-associative cache architectures

  • The simplest replacement policy is FIFO (First In First Out), which uses a straightforward strategy to select a victim block, while the most widely used traditional replacement policy is LRU (Least Recently Used), which selects a victim block based on reference history

  • Replacement policy plays a major role in improving the performance of the modern highly associative cache memories


Summary

INTRODUCTION

Replacement policy plays the most significant role in the performance of highly set-associative cache architectures. The three important operations of any replacement policy are insertion, promotion, and eviction. Today's data-intensive applications demand larger and more highly associative caches (especially the LLC); such highly associative caches reduce conflict misses and improve system performance. In the LRU replacement policy, the eviction mechanism always selects the least recently used block as the victim block, while the promotion mechanism makes a block the MRU block on a hit: if a block B is present in the bank and a request is made to access it, B is promoted to the MRU position. The proposed policy attempts to reduce the hardware cost of LRU-based techniques while retaining high dead-block prediction ability, so that the hit rate of the memory can be improved.
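The promotion and eviction mechanisms described above can be modeled for a single set as follows. This is an illustrative sketch only; the class name and interface are assumptions, not the paper's code:

```python
from collections import OrderedDict

class LRUSet:
    """Minimal model of one cache set under LRU: a hit promotes the
    block to the MRU position, and a miss on a full set evicts the
    LRU block before inserting the new one."""

    def __init__(self, ways):
        self.ways = ways
        self.blocks = OrderedDict()  # order: first = LRU, last = MRU

    def access(self, tag):
        if tag in self.blocks:
            self.blocks.move_to_end(tag)    # promotion: hit block becomes MRU
            return True                     # hit
        if len(self.blocks) == self.ways:
            self.blocks.popitem(last=False) # eviction: remove the LRU block
        self.blocks[tag] = None             # insertion: new block enters as MRU
        return False                        # miss
```

The model also shows why dead blocks linger: a block that will never be referenced again is still inserted at the MRU position and must age through every recency position before it can be evicted, occupying a way the whole time.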

BACKGROUND
SMALL-LRU
Replacement Operations
Advantage of Small-LRU
EXPERIMENTAL ANALYSIS
Result Analysis with Baseline-1
Result Analysis with Baseline-2
Findings
CONCLUSION