Abstract

As technology scales, negative bias temperature instability (NBTI) becomes one of the primary failure mechanisms for Very Large Scale Integration (VLSI) circuits. Meanwhile, leakage power increases dramatically as the supply and threshold voltages continue to scale down. Together, these two issues pose severe reliability and power problems for complementary metal oxide semiconductor (CMOS) devices. Because both NBTI and leakage depend on the circuit's input vector, we present an input vector control (IVC) method based on a linear programming algorithm that co-optimizes circuit aging and power dissipation. In addition, the proposed IVC method is combined with the supply voltage assignment technique to further reduce delay degradation and leakage power. Experimental results on various circuits show the effectiveness of the proposed combination method.

Highlights

  • As technology scales, reliability issues have become a vital concern in Very Large Scale Integration (VLSI) design

  • Negative bias temperature instability (NBTI) occurs when p-channel metal oxide semiconductor (PMOS) transistors are negatively biased (Vgs = −Vdd), which causes a shift in the threshold voltage (Vth)

  • If the transistor is in the stress phase in standby mode, the duty cycle (α) for this transistor is α = (c × R_AS + 1)/(R_AS + 1); a small worked example follows this list. Another important issue that should be considered in the NBTI model is the stacking effect when multiple transistors are connected in series
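A quick numerical reading of that duty-cycle expression. The summary does not define c or R_AS, so the interpretation below (c as the stress probability during active mode, R_AS as the ratio of active time to standby time) and the numbers are illustrative assumptions only:

    def stress_duty_cycle(c, r_as):
        # Duty cycle alpha = (c * R_AS + 1) / (R_AS + 1) for a transistor
        # that stays stressed throughout standby.
        # Assumed reading: c = stress probability in active mode,
        # r_as = active-time / standby-time ratio.
        return (c * r_as + 1.0) / (r_as + 1.0)

    # Mostly-idle circuit: R_AS = 0.25, active-mode stress probability c = 0.5
    print(stress_duty_cycle(0.5, 0.25))  # -> 0.9, i.e. stressed 90% of the time

As R_AS grows (the circuit is mostly active), α approaches c; as R_AS approaches zero (mostly standby), α approaches 1, consistent with the assumption that the transistor is stressed during standby.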


Summary

Introduction

Reliability issues have become a vital concern in Very Large Scale Integration (VLSI) design. Chen et al. proposed a supply voltage assignment method to co-optimize power consumption and the NBTI effect for CMOS devices [10]. Among these methods, supply voltage assignment (SVA) is commonly used because of its easy implementation, and we use SVA to compensate for NBTI-induced delay degradation in this paper. However, a high Vdd results in increased power consumption as well as an accelerated aging process. Moreover, both the NBTI effect and leakage power depend on the input vector applied to the circuit in standby mode. To solve this problem, a novel NBTI and leakage co-optimization algorithm based on an ILP formulation is proposed in this paper. This method considers the two issues simultaneously and finds the optimal input vector, providing a balanced tradeoff between performance and power.
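The summary does not reproduce the authors' ILP formulation, so the sketch below only illustrates the underlying idea of input vector control on a hypothetical two-gate netlist: enumerate candidate standby vectors and pick the one minimizing a weighted combination of an NBTI-stress proxy and a leakage proxy. The netlist, per-gate leakage table, stress proxy, and weight lam are invented for illustration; the paper instead encodes this search as an ILP and further combines it with supply voltage assignment, neither of which is shown here.

    from itertools import product

    # Toy netlist: each gate is (gate_type, input_nets, output_net).
    # Netlist, cost tables, and weight are illustrative only.
    NETLIST = [
        ("NAND2", ("a", "b"), "n1"),
        ("NOR2",  ("n1", "c"), "out"),
    ]
    PRIMARY_INPUTS = ("a", "b", "c")

    # Per-gate leakage (arbitrary units) indexed by the gate's input values.
    LEAKAGE = {
        "NAND2": {(0, 0): 1.0, (0, 1): 2.3, (1, 0): 2.1, (1, 1): 4.0},
        "NOR2":  {(0, 0): 3.8, (0, 1): 1.9, (1, 0): 2.0, (1, 1): 1.2},
    }

    def evaluate(gate_type, ins):
        # Logic function of each supported gate type.
        if gate_type == "NAND2":
            return 0 if all(ins) else 1
        if gate_type == "NOR2":
            return 0 if any(ins) else 1
        raise ValueError(gate_type)

    def nbti_stress(gate_type, ins):
        # Proxy for NBTI stress: count PMOS devices whose gate input is low,
        # since a PMOS is stressed when negatively biased (Vgs = -Vdd).
        return sum(1 for v in ins if v == 0)

    def standby_cost(vector, lam=0.5):
        # Weighted cost of one standby input vector: lam trades off the
        # NBTI-stress proxy against the leakage proxy.
        values = dict(zip(PRIMARY_INPUTS, vector))
        stress = leak = 0.0
        for gate_type, in_nets, out_net in NETLIST:
            ins = tuple(values[n] for n in in_nets)
            values[out_net] = evaluate(gate_type, ins)
            stress += nbti_stress(gate_type, ins)
            leak += LEAKAGE[gate_type][ins]
        return lam * stress + (1.0 - lam) * leak

    # Exhaustive enumeration stands in for the paper's ILP solver here.
    best = min(product((0, 1), repeat=len(PRIMARY_INPUTS)), key=standby_cost)
    print("best standby vector:", dict(zip(PRIMARY_INPUTS, best)))

Exhaustive enumeration is only tractable for a handful of primary inputs; the appeal of an ILP formulation is that it scales this selection to realistic circuits while keeping the aging/leakage tradeoff explicit in the objective.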

NBTI-Induced Transistor Aging
Path-Based NBTI Model
Cell-Based Leakage Power Model
ILP Formulation for NBTI Mitigation and Leakage Reduction Only
ILP Formulation
Objective
ILP Formulation for Leakage Reduction
Supply Voltage
Minimum NBTI Vector Selection Considering Power Effect
Design
Results and Discussion
Findings
Leakage
Conclusions