Abstract

Due to the circuit aging effect, the minimum leakage vector (MLV) found by the traditional input vector control method may no longer yield optimal leakage power reduction once the circuit begins to degrade. To solve this problem, we present an adaptive MLV selection strategy based on a linear programming approach. The method divides the total lifetime of the circuit into a succession of time intervals, and the MLV used in each interval is periodically updated according to the transistors' threshold voltage degradation, so that the best overall power reduction is achieved across the circuit's lifetime. Experimental results on various benchmark circuits show the effectiveness of our method.
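To make the per-interval idea concrete, the following is a minimal sketch, not the authors' method: it models subthreshold leakage as decaying exponentially with threshold voltage, applies a BTI-style power-law Vth shift that is faster for devices held in the stressed state, and re-selects the MLV at the start of each interval. The function names (leakage, aged_vth, select_mlv_schedule), all constants, the two-input toy circuit, and the exhaustive candidate search (standing in for the paper's linear-programming formulation) are illustrative assumptions only.

```python
import math

# Toy sketch of per-interval MLV re-selection. All constants, the aging
# model, and the exhaustive search are illustrative assumptions; the
# paper formulates the selection as a linear program.

def leakage(vth: float, i0: float = 1.0, n: float = 1.5, vt: float = 0.026) -> float:
    """Subthreshold leakage ~ I0 * exp(-Vth / (n * kT/q))."""
    return i0 * math.exp(-vth / (n * vt))

def aged_vth(vth0: float, k: float, t_years: float) -> float:
    """BTI-style power-law shift: Vth(t) = Vth0 + k * t^0.25."""
    return vth0 + k * t_years ** 0.25

def circuit_leakage(vector, t_years: float) -> float:
    """Leakage of a toy two-input circuit: each input bit selects which
    transistor stack leaks. A '0' picks a high-Vth stack that ages slowly;
    a '1' picks a low-Vth stack held under stress, so it ages faster."""
    total = 0.0
    for bit in vector:
        vth0, k = (0.32, 0.03) if bit else (0.35, 0.01)
        total += leakage(aged_vth(vth0, k, t_years))
    return total

def select_mlv_schedule(candidates, interval_starts):
    """Divide the lifetime into intervals and, at the start of each one,
    re-evaluate every candidate vector under the degraded Vth values,
    keeping the minimum leakage vector (MLV) for that interval."""
    return [(t, min(candidates, key=lambda v: circuit_leakage(v, t)))
            for t in interval_starts]

if __name__ == "__main__":
    candidates = [(0, 0), (0, 1), (1, 0), (1, 1)]
    for t, mlv in select_mlv_schedule(candidates, [0, 2, 4, 6, 8, 10]):
        print(f"year {t:>2}: MLV = {mlv}, leakage = {circuit_leakage(mlv, t):.4f}")
```

In this toy instance the stressed low-Vth stack ages fastest, so the optimal vector flips from (0, 0) to (1, 1) partway through the lifetime, which is exactly the effect that makes a fixed, once-computed MLV suboptimal under aging.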
