Abstract
1/f noise and random telegraph signal (RTS) noise are increasingly dominant sources of low-frequency noise as the MOSFET enters the nanoscale regime. In this study, 1/f noise and RTS noise in the n-channel MOSFET are modelled in the time domain for efficient implementation in transient circuit simulation. A technique based on a sum of sinusoids models 1/f noise, while a Monte Carlo based technique is used to generate RTS noise. Low-frequency noise generated using these models exhibits the correct noise characteristics predicted by theory, with noise parameters taken from standard 0.35-μm and 35-nm CMOS technologies. Implementation of the time-domain model in SPICE shows the utility of the noisy MOSFET model in simulating the effect of low-frequency noise on the operation of deep-submicrometer analog integrated circuits.
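The two generation techniques named in the abstract can be illustrated with a minimal sketch in NumPy. This is not the authors' implementation; the parameter names (`f_lo`, `f_hi`, `tau_c`, `tau_e`, `delta`) and the specific amplitude scaling are assumptions chosen only to show the general form: 1/f noise as a sum of log-spaced sinusoids with power matched to a 1/f spectrum and random phases, and RTS noise as a two-state trap with exponentially distributed capture/emission dwell times.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_over_f_noise(t, f_lo=1.0, f_hi=1e4, n_tones=100, amp=1.0):
    """Sum-of-sinusoids 1/f noise (illustrative sketch).

    Tones are log-spaced over [f_lo, f_hi]; each tone carries power
    ~ S(f_k) * df_k with S(f) = amp / f, and a random phase.
    """
    f = np.logspace(np.log10(f_lo), np.log10(f_hi), n_tones)
    df = np.gradient(f)                     # per-tone bandwidth
    a = np.sqrt(2.0 * amp / f * df)         # sinusoid amplitudes
    phi = rng.uniform(0.0, 2 * np.pi, n_tones)
    return (a[:, None] * np.sin(2 * np.pi * f[:, None] * t + phi[:, None])).sum(axis=0)

def rts_noise(t, tau_c=1e-3, tau_e=1e-3, delta=1.0):
    """Monte Carlo RTS noise (illustrative sketch).

    A single trap toggles between two states; dwell times are drawn
    from exponential distributions with means tau_c (capture) and
    tau_e (emission), and the occupied state shifts the signal by delta.
    """
    x = np.zeros_like(t)
    state, now, i = 0, t[0], 0
    while now < t[-1]:
        dwell = rng.exponential(tau_c if state == 0 else tau_e)
        j = np.searchsorted(t, now + dwell)
        x[i:j] = state * delta
        state, now, i = 1 - state, now + dwell, j
    x[i:] = state * delta
    return x

# Combined low-frequency noise sample over a 0.1 s transient window.
t = np.linspace(0.0, 0.1, 10_000)
lf_noise = one_over_f_noise(t) + rts_noise(t)
```

In a transient circuit simulation, a waveform like `lf_noise` would be superposed on the MOSFET drain current (or referred to the gate) at each time step; the superposition of many such RTS traps with a distribution of time constants also yields a 1/f-like spectrum.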
Published in: IEEE Transactions on Circuits and Systems I: Regular Papers