Abstract

Loop caches provide an effective method for decreasing memory hierarchy energy consumption by storing frequently executed code in a structure that is more energy-efficient than the level-one cache. However, due to code structure restrictions and/or costly design-time pre-analysis efforts, previous loop cache designs are not suitable for all applications and system scenarios. In this paper, we present an adaptive loop cache that is amenable to a wide range of system scenarios, providing an additional 20% average instruction memory hierarchy energy savings (with individual benchmark savings as high as 69%) compared to the best previous loop cache design.
