Abstract

This article describes a comprehensive framework for analyzing the time-dependent performance and reliability degradation of an SRAM cache, accounting for cache configuration, process parameters and their variations, supply voltage, and aging. The framework consists of three parts: microprocessor emulation, activity extraction, and evaluation of performance-reliability metrics. The metric evaluation is implemented with a prediction engine built on regression models that capture degradation due to various wearout mechanisms, including bias temperature instability (BTI), hot carrier injection (HCI), and random telegraph noise (RTN). The regression models not only enable more than 100× faster computation than SPICE simulations but also protect intellectual property. The framework has been applied to study how SRAM instruction cache (I-Cache) configuration, cell structure, inclusion of RTN and gate-length variation, voltage scaling, and stress time affect performance and reliability parameters such as access time, leakage power, critical charge (Q_crit), and static noise margin (SNM). We have also studied the impact of configuration parameters on the soft error rate (SER) and hit rate of the I-Cache, as well as the impact of single error correction and double error detection (SECDED) error correcting codes (ECCs).

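To illustrate the idea of a regression-based prediction engine standing in for SPICE, the sketch below fits a surrogate model that maps a BTI-induced threshold-voltage shift and supply voltage to an aged access-time estimate. This is a hypothetical example under assumed values: the power-law BTI coefficients, the feature set, and the synthetic characterization data are illustrative and are not taken from the article.

```python
# Hypothetical sketch of a regression-based aging prediction engine.
# All coefficients, features, and the power-law BTI model below are
# illustrative assumptions, not values from the article.
import numpy as np
from sklearn.linear_model import LinearRegression

def bti_delta_vth(stress_time_s, vdd, temp_k, a=2e-3, n=0.16, ea=0.05):
    """Assumed power-law BTI threshold-voltage shift (V)."""
    k_b = 8.617e-5  # Boltzmann constant, eV/K
    return a * np.exp(-ea / (k_b * temp_k)) * vdd * stress_time_s**n

# Synthetic stand-in for SPICE characterization data:
# (delta_vth, vdd) -> SRAM read access time (ns).
rng = np.random.default_rng(0)
delta_vth = rng.uniform(0.0, 0.05, 200)        # V
vdd = rng.uniform(0.7, 1.1, 200)               # V
access_time = (0.45 + 2.0 * delta_vth - 0.25 * (vdd - 0.9)
               + rng.normal(0, 0.002, 200))    # synthetic response

# Fit the surrogate regression model once; later queries reuse it
# instead of rerunning circuit simulation.
X = np.column_stack([delta_vth, vdd])
model = LinearRegression().fit(X, access_time)

# Query: estimated access time after 3 years of stress at 1.0 V, 358 K.
dvth = bti_delta_vth(3 * 365 * 24 * 3600, vdd=1.0, temp_k=358)
print(model.predict([[dvth, 1.0]]))  # aged access time estimate, ns
```

Because the fitted model exposes only regression coefficients rather than the underlying transistor models and netlists, a flow of this kind can also keep foundry and design intellectual property hidden from the end user, which is consistent with the IP-protection point made in the abstract.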