Random telegraph noise (RTN) induces adverse time-dependent device-to-device variations and must be modeled to optimize circuit design. Many early works focused on dc test conditions, although digital circuits typically operate under ac conditions and ac RTN has been reported to differ substantially from dc RTN. Tests of ac RTN have been carried out mainly on individual traps, and a reliable statistical distribution of trap time constants for ac RTN is still missing. This prevents verification of the statistical accuracy of Monte Carlo ac RTN simulations based on compact models, especially their ability to predict ac RTN as the time window increases. Recently, an integral methodology has been proposed for dc RTN, which not only models it at short times but also predicts it at long times. By introducing the concept of effective charged traps, it removes the need for a statistical distribution of trap time constants, making RTN prediction similar to aging prediction. The objectives of this work are to report statistical experimental ac RTN data and to test the applicability of the integral methodology to them. For the first time, it is shown that a model extracted from a time window of 7.8 s can be used to predict the statistical distribution of long-term ($3\times 10^{4}$ s) ac RTN. The dependence of ac RTN on frequency and time window is analyzed, and the contributions of carrier tunneling from the gate and the substrate are assessed.
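For context, a minimal sketch of the conventional Monte Carlo RTN approach that the abstract contrasts with the integral methodology: each trap is treated as a two-state process with mean capture and emission time constants, and the total threshold-voltage shift is the sum of the occupied traps' amplitudes. The trap population, distributions, and function names below are illustrative assumptions, not the paper's model or data.

```python
import numpy as np


def simulate_rtn_trace(tau_c, tau_e, delta_vth, t_window, dt, rng):
    """Monte Carlo trace for one two-state trap under constant (dc) bias.

    tau_c     : mean capture time constant (s), empty -> filled
    tau_e     : mean emission time constant (s), filled -> empty
    delta_vth : threshold-voltage shift when the trap is filled (V)
    Returns the trap's Vth-shift contribution sampled every dt over t_window.
    """
    n_steps = int(t_window / dt)
    trace = np.empty(n_steps)
    # Start from the equilibrium occupancy probability tau_e / (tau_c + tau_e).
    filled = rng.random() < tau_e / (tau_c + tau_e)
    t_next = rng.exponential(tau_e if filled else tau_c)  # time of next transition
    for i in range(n_steps):
        t = i * dt
        while t >= t_next:  # apply all transitions that occurred before this sample
            filled = not filled
            t_next += rng.exponential(tau_e if filled else tau_c)
        trace[i] = delta_vth if filled else 0.0
    return trace


rng = np.random.default_rng(0)
# Hypothetical trap population: log-uniform time constants, exponential amplitudes.
n_traps = 20
tau_c = 10 ** rng.uniform(-3, 2, n_traps)   # s
tau_e = 10 ** rng.uniform(-3, 2, n_traps)   # s
dvth = rng.exponential(0.5e-3, n_traps)     # V per trap
total = sum(
    simulate_rtn_trace(tc, te, dv, t_window=10.0, dt=1e-3, rng=rng)
    for tc, te, dv in zip(tau_c, tau_e, dvth)
)
print(f"peak-to-peak Vth shift: {(total.max() - total.min()) * 1e3:.2f} mV")
```

This kind of simulation requires the statistical distribution of trap time constants as an input, which is exactly the quantity the abstract notes is missing for ac RTN; under ac operation the capture and emission rates also alternate with the gate bias, which this dc sketch does not capture.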