Abstract

This work introduces a new protocol to facilitate massive on-wafer characterization of Random Telegraph Noise (RTN) in MOS transistors. The methodology combines noise spectral density scanning over gate bias with a modified Weighted Time Lag Plot algorithm to unequivocally identify single-trap RTN signals under optimum bias conditions for their electrical characterization. The strength of the method is demonstrated by applying it to monitor the distribution of traps over the transistors of a SOI wafer. The influence of the back-gate bias on the RTN characteristics of SOI devices with coupled front and back interfaces has revealed unusual behavior consistent with carrier emission to the gate metal contact.
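As a rough illustration of the Time Lag Plot technique referenced in the abstract, the sketch below builds a weighted time-lag histogram of a noisy signal: each consecutive sample pair (x[i], x[i+1]) is spread over a grid with a Gaussian kernel, so discrete RTN levels show up as lobes on the main diagonal. This is a minimal generic sketch, not the paper's modified algorithm; the kernel-width heuristic `alpha` and all parameter names are assumptions for illustration.

```python
import numpy as np

def weighted_time_lag_plot(signal, bins=64, alpha=None):
    """Weighted Time Lag Plot (WTLP) sketch for RTN analysis.

    Each pair (x[i], x[i+1]) is smeared with a 2-D Gaussian kernel
    on a bins x bins grid.  Single-trap two-level RTN produces two
    lobes on the diagonal plus weaker off-diagonal transition lobes.
    `alpha` (kernel width) is a heuristic choice, not a value taken
    from the paper.
    """
    x, y = signal[:-1], signal[1:]          # consecutive sample pairs
    lo, hi = signal.min(), signal.max()
    grid = np.linspace(lo, hi, bins)
    if alpha is None:
        alpha = (hi - lo) / bins            # assumed kernel-width heuristic
    gx, gy = np.meshgrid(grid, grid)
    psi = np.zeros((bins, bins))
    for xi, yi in zip(x, y):
        psi += np.exp(-((gx - xi) ** 2 + (gy - yi) ** 2) / (2 * alpha ** 2))
    return grid, psi / psi.max()            # normalize to [0, 1]

# Synthetic two-level RTN trace as a usage example.
rng = np.random.default_rng(0)
levels = rng.random(500) < 0.5
signal = np.where(levels, 1.0, 0.0) + 0.01 * rng.standard_normal(500)
grid, psi = weighted_time_lag_plot(signal, bins=32)
```

In a plot of `psi`, a clean single-trap signal would show its two RTN levels as bright spots on the diagonal, which is the visual signature the identification step relies on.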
