Abstract

As technology scales, the aging effect caused by Negative Bias Temperature Instability (NBTI) has become a major reliability concern for circuit designers, prompting substantial research effort on NBTI analysis and mitigation techniques. At the same time, reducing leakage power remains one of the major design goals. Both NBTI-induced circuit degradation and standby leakage power depend strongly on the input patterns applied to a circuit. In this paper, we propose a co-simulation flow to study NBTI-induced circuit degradation and leakage power, accounting for the different circuit behaviors during active and standby time. Based on this flow, we evaluate the efficacy of the Input Vector Control (IVC) technique in mitigating circuit aging and reducing standby leakage power, with experiments on benchmark circuits implemented at the 90 nm, 65 nm, and 45 nm technology nodes. IVC proves effective in mitigating NBTI-induced circuit degradation, reducing circuit performance degradation by up to 56% at the 65 nm node and by 30% on average across the technology nodes. Meanwhile, IVC can save up to 18% of the worst-case leakage power. Since leakage power and NBTI-induced circuit degradation depend differently on input patterns, we propose deriving Pareto sets that let designers explore trade-offs between lifetime reliability and leakage power.
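
To illustrate the trade-off exploration the abstract describes, the sketch below derives a Pareto set over candidate standby input vectors. This is a minimal illustration only: the `Candidate` fields, the two-objective dominance test, and all metric values are assumptions for demonstration, not the paper's actual co-simulation flow or experimental data.

```python
# A minimal sketch of the Pareto-set idea, assuming each candidate standby
# input vector has already been characterized (e.g., by a co-simulation
# flow) with two costs: NBTI-induced delay degradation and standby leakage.
# All names and numbers here are illustrative, not taken from the paper.

from typing import NamedTuple


class Candidate(NamedTuple):
    vector: str               # input vector applied during standby
    delay_degradation: float  # NBTI-induced delay increase (%)
    leakage_power: float      # standby leakage power (normalized)


def dominates(a: Candidate, b: Candidate) -> bool:
    """a dominates b if it is no worse in both objectives and better in one."""
    return (a.delay_degradation <= b.delay_degradation
            and a.leakage_power <= b.leakage_power
            and (a.delay_degradation < b.delay_degradation
                 or a.leakage_power < b.leakage_power))


def pareto_set(candidates: list[Candidate]) -> list[Candidate]:
    """Keep only the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates)]


# Hypothetical characterization results for a 4-input benchmark circuit:
# no single vector minimizes both aging and leakage at once.
candidates = [
    Candidate("0000", 4.1, 0.80),   # best leakage, worst aging
    Candidate("0101", 2.8, 1.05),   # best aging, worst leakage
    Candidate("1010", 3.3, 0.90),   # balanced trade-off point
    Candidate("1111", 3.9, 1.10),   # dominated by "0101"
]

for c in pareto_set(candidates):
    print(f"{c.vector}: +{c.delay_degradation}% delay, "
          f"{c.leakage_power} leakage")
```

Running this prints the three non-dominated vectors ("0000", "0101", "1010"), which is exactly the set a designer would inspect when trading lifetime reliability against standby leakage.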
