Abstract

With the downscaling of device dimensions, the variability of metal–oxide–semiconductor field-effect transistor (MOSFET) electrical behavior is increasingly produced by factors other than variations in physical dimensions and doping profiles, which are fixed at fabrication and remain static over time. Besides these time-zero variability sources, factors that cause performance to vary from one instant in time to the next start to play a significant role. Random telegraph noise (RTN) is among these relevant time-dependent variability sources, causing the threshold voltage ($V_T$) of a transistor to change over time. In this work, we extend the understanding of the time-dependent random variability induced by RTN by providing a statistical model for the RTN-induced threshold voltage variance over time. The area scaling of the threshold voltage variance is detailed and discussed, supporting designers in transistor sizing toward a more reliable design. Not only is the threshold voltage variance expected in a single transistor modeled, but also its variability among transistors that by design should behave identically. It is shown that with device size downscaling, the variability of RTN among devices increases faster than its mean value. The relationship between the time domain (RTN) and the frequency domain (low-frequency noise) is also studied. The modeling provides equations to extract the trap amplitude contribution and trap densities from the moments of a measured low-frequency noise distribution or from the moments of a measured $V_T$ variation distribution, without having to characterize individual step heights. In addition to analytical modeling, Monte Carlo simulations are run, illustrating the RTN-induced time-dependent random $V_T$ variation and the applicability of the model.
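To illustrate the kind of Monte Carlo experiment described above, the sketch below samples a population of devices, each holding a Poisson-distributed number of traps whose step heights are exponentially distributed, and compares the mean and spread of the per-device RTN-induced $V_T$ variance for different gate areas. All parameter values (trap density, mean step height, occupancy probability) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, chosen only for illustration (not from the paper).
trap_density = 1e10   # traps per cm^2 of gate area (assumed)
eta = 1e-3            # mean single-trap V_T step height in volts (assumed exponential)
p_occ = 0.5           # instantaneous trap occupancy probability (assumed)
n_devices = 10_000    # Monte Carlo sample size

def vt_variance_per_device(area_cm2):
    """Per-device RTN-induced V_T variance for a sample of devices.

    Each device receives a Poisson number of traps; each trap contributes an
    exponentially distributed step height and toggles independently, so the
    instantaneous V_T variance of one device is sum(step_i^2) * p * (1 - p).
    """
    lam = trap_density * area_cm2
    n_traps = rng.poisson(lam, size=n_devices)
    variances = np.empty(n_devices)
    for i, n in enumerate(n_traps):
        steps = rng.exponential(eta, size=n)
        variances[i] = np.sum(steps**2) * p_occ * (1.0 - p_occ)
    return variances

# Shrinking the gate area: the relative spread (std/mean) of the per-device
# variance grows as the device gets smaller, i.e., device-to-device RTN
# variability increases faster than its mean value.
for area in (1e-10, 1e-11, 1e-12):   # gate areas in cm^2 (assumed)
    var = vt_variance_per_device(area)
    mean, spread = var.mean(), var.std()
    print(f"area={area:.0e} cm^2  mean var={mean:.3e} V^2  std/mean={spread / mean:.2f}")
```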
