Abstract

Accurate reliability prediction for real-world digital logic circuits relies heavily on the relevance of device-level testing. In the case of bias temperature instability (BTI), where recovery plays a significant role, translating device-level reliability data into practical predictions of real-world circuit behavior requires a leap of faith. In this paper, we develop a methodology to bridge this gap by employing an eye-diagram approach, which allows us to monitor, at circuit speed, device-level random jitter degradation in response to stress. By applying a variety of positive-BTI gate-voltage stress and sense bit sequences (including dc, ring-oscillator (RO), and pseudorandom patterns), we compare the effectiveness of these approaches at capturing random timing jitter. We find that conventional RO-type measurements are unable to capture the random jitter degradation, which calls into question the use of RO structures as a proxy for real random-logic circuits. Only when a pseudorandom bit sequence is employed does the true extent of the jitter degradation become observable. This is an important development and provides an accurate means of translating device-level reliability data into predictions of real-world digital logic circuit degradation.
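
The contrast drawn above between regular RO toggling and pseudorandom stimulus is easy to make concrete. The sketch below (Python) generates a pseudorandom bit sequence with a linear-feedback shift register; the specific generator (PRBS-7, polynomial x^7 + x^6 + 1) is an assumption, since the abstract does not name one. The run-length printout illustrates why such a sequence matters: a PRBS exercises irregular, data-dependent stress/recovery intervals that a strictly alternating 0101... RO pattern cannot.

```python
# Minimal sketch, assuming a PRBS-7 LFSR (x^7 + x^6 + 1); the paper's
# actual bit-sequence generator is not specified in the abstract.

def prbs7(seed: int = 0x7F, nbits: int = 127) -> list[int]:
    """Generate nbits of a PRBS-7 sequence (maximal period: 127 bits)."""
    state = seed & 0x7F  # 7-bit LFSR state; seed must be nonzero
    out = []
    for _ in range(nbits):
        fb = ((state >> 6) ^ (state >> 5)) & 1  # feedback taps at bits 7 and 6
        out.append(fb)
        state = ((state << 1) | fb) & 0x7F  # shift left, feed back into LSB
    return out

if __name__ == "__main__":
    seq = prbs7()
    # Unlike an RO's alternating pattern, a PRBS mixes long and short runs,
    # so the device sees history-dependent stress and recovery windows.
    runs, count = [], 1
    for a, b in zip(seq, seq[1:]):
        if a == b:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    two_periods = prbs7(nbits=254)
    print("repeats after 127 bits:", two_periods[:127] == two_periods[127:])
    print("run lengths present:", sorted(set(runs)))
```

An RO pattern, by comparison, has a single run length of 1, which is why purely periodic toggling averages out the recovery behavior the pseudorandom stimulus exposes.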
