Abstract

1/f noise and random telegraph signal (RTS) noise become increasingly dominant sources of low-frequency noise as the MOSFET enters the nanoscale regime. In this study, 1/f noise and RTS noise in the n-channel MOSFET are modelled in the time domain for efficient implementation in transient circuit simulation. A sum-of-sinusoids technique models the 1/f noise, while a Monte Carlo technique generates the RTS noise. Low-frequency noise generated with these models exhibits the noise characteristics predicted by theory, using noise parameters from standard 0.35-µm and 35-nm CMOS technologies. Implementation of the time-domain model in SPICE demonstrates the utility of the noisy MOSFET model in simulating the effect of low-frequency noise on the operation of deep-submicrometer analog integrated circuits.
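The two time-domain techniques named above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sum-of-sinusoids generator superposes log-spaced tones whose amplitudes scale as 1/√f (so the resulting power spectral density falls as 1/f), and the RTS generator is a Monte Carlo two-state telegraph process with exponentially distributed dwell times. All parameter values (tone count, frequency band, dwell times, step amplitude) are placeholders chosen for illustration.

```python
import numpy as np

def one_over_f_noise(t, f_min=1.0, f_max=1e4, n_tones=40, amp=1e-3, rng=None):
    """Sum-of-sinusoids approximation of 1/f noise.

    Tones are log-spaced in [f_min, f_max] with random phases; each
    tone's amplitude scales as 1/sqrt(f), so the power in each tone
    (A_k^2 / 2) falls off as 1/f, approximating a 1/f spectrum.
    """
    rng = np.random.default_rng() if rng is None else rng
    freqs = np.logspace(np.log10(f_min), np.log10(f_max), n_tones)
    phases = rng.uniform(0.0, 2.0 * np.pi, n_tones)
    amps = amp / np.sqrt(freqs)
    # Superpose all tones at every time sample (t is a 1-D array).
    return (amps[None, :]
            * np.sin(2.0 * np.pi * freqs[None, :] * t[:, None] + phases)
            ).sum(axis=1)

def rts_noise(t, tau_high=1e-3, tau_low=1e-3, delta=1e-3, rng=None):
    """Monte Carlo two-state RTS (random telegraph signal).

    The signal toggles between 0 and `delta`; dwell times in each state
    are drawn from exponential distributions with means tau_low and
    tau_high, mimicking single-trap capture/emission.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros_like(t)
    state = 0                                # start in the low state
    t_next = rng.exponential(tau_low)        # time of the next transition
    for i, ti in enumerate(t):
        while ti >= t_next:                  # advance through any switches
            state ^= 1
            t_next += rng.exponential(tau_high if state else tau_low)
        x[i] = delta * state
    return x
```

In a transient simulation the two contributions would simply be summed and injected as a drain-current (or gate-referred voltage) noise source at each time step.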
