Abstract

Random telegraph noise (RTN) adversely impacts circuit performance, and this impact grows as devices shrink and operating voltages fall. Many efforts have been made to model RTN in support of circuit-design optimization. RTN is highly stochastic, with significant device-to-device variations (DDVs). Early works often characterized individual traps first and then grouped them to extract their statistical distributions. This bottom-up approach is limited by the number of traps that can be measured, especially for the capture and emission time constants, calling the reliability of the extracted distributions into question. Several compact models have been proposed, but their ability to predict long-term RTN has not been verified. Many early works measured RTN for only tens of seconds, although a longer time window increases RTN by capturing slower traps. The aim of this work is to propose an integral methodology for modeling RTN and, for the first time, to verify its capability of predicting long-term RTN. Instead of characterizing the properties of individual traps and devices, the RTN of multiple devices is integrated into one dataset for extracting their statistical properties. This enables the concept of effective charged traps (ECTs) and replaces the need for time-constant distributions with the kinetics of ECTs, making long-term RTN prediction similar to predicting aging. The proposed methodology opens the way for assessing the impact of RTN within a window of ten years by efficiently evaluating the probability of a device parameter reaching a given level.
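The capture and emission time constants mentioned above are commonly modeled by treating each trap as a two-state process with exponentially distributed dwell times. The following sketch illustrates that conventional single-trap picture (not the ECT methodology proposed in the paper); the function names and the time-constant values are hypothetical, chosen only for illustration.

```python
import random

def simulate_rtn_trap(tau_c, tau_e, t_total, seed=0):
    """Simulate one two-state RTN trap as an alternating renewal process.

    Dwell times in the empty and filled states are drawn from exponential
    distributions with mean capture time tau_c and mean emission time
    tau_e (both in seconds; values here are illustrative, not from the paper).
    Returns a list of (time, new_state) transition events, clamped to t_total.
    """
    rng = random.Random(seed)
    t, filled = 0.0, False  # trap starts empty
    transitions = []
    while t < t_total:
        mean = tau_e if filled else tau_c
        t += rng.expovariate(1.0 / mean)
        filled = not filled
        transitions.append((min(t, t_total), filled))
    return transitions

def occupancy_fraction(transitions, t_total):
    """Fraction of the window the trap spends in the filled (charged) state."""
    occupied, prev_t, prev_state = 0.0, 0.0, False
    for t, new_state in transitions:
        if prev_state:
            occupied += t - prev_t
        prev_t, prev_state = t, new_state
    return occupied / t_total

# Example: a trap with a 1 ms capture and 3 ms emission time constant,
# observed for 10 s. The long-run occupancy should approach
# tau_e / (tau_c + tau_e) = 0.75.
trans = simulate_rtn_trap(tau_c=1e-3, tau_e=3e-3, t_total=10.0)
frac = occupancy_fraction(trans, 10.0)
```

Note how a short measurement window biases this picture: a trap whose time constants exceed the window length rarely switches at all, which is why the abstract argues that tens-of-seconds measurements under-count slow traps.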
