Abstract

Random telegraph signals (RTS) are switching events between two or more discrete levels observed in the drain current or voltage of a MOSFET, originating from traps at the Si/SiO2 interface through the capture and emission of charge carriers. Although several models for low-frequency noise in MOSFETs are available today, none of them provides modeling tools for RTS. A model has been developed for RTS at the drain of sub-micron-scale MOSFETs. The RTS power spectral density is given in terms of three parameters that fully characterize the RTS: the capture time, the emission time and the RTS amplitude. These three parameters are expressed in terms of the device's physical parameters, biasing conditions and temperature through seven independent modeling parameters: the trap position, xT and yT, the trap energy, ET − ECox, the capture cross-section, σ0, the trap binding energy, ΔEB, and the empirical fitting constants for the screened scattering coefficient, K1 and K2. The model was tested against RTS data obtained on sub-micron LDD n-MOSFETs. The measured results were compared with the model and the fitting parameters were extracted. The trap position xT was found to be 13 Å, close to the Si–SiO2 interface compared with the oxide thickness of 50 Å. yT is 0.12 μm, indicating that the trap is located close to the drain side. The σ0 obtained from the fits was 7 × 10⁻²⁰ cm² at VGS = 1.2 V and increased with gate voltage. ΔEB was 0.3 eV. The K1 and K2 values were evaluated to be 3 × 10⁻¹³ and −2.5 × 10⁻¹⁶ V s, respectively. The extracted parameters are comparable to values reported for similar devices.
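
For context, the power spectral density of a two-level RTS is commonly written as a Lorentzian in the capture time, emission time and RTS amplitude (the standard Machlup form); the paper's specific expressions linking these three parameters to trap position, trap energy and bias are not reproduced here. A minimal Python sketch of that standard form, with purely illustrative parameter values (not taken from the paper's measurements), is:

    import numpy as np

    def rts_psd(f, tau_c, tau_e, delta_i):
        """Standard Lorentzian PSD of a two-level RTS (Machlup form).

        f       : frequency array [Hz]
        tau_c   : mean capture time [s]
        tau_e   : mean emission time [s]
        delta_i : RTS drain-current amplitude [A]
        """
        # Effective (corner) time constant of the two-level process
        inv_tau = 1.0 / tau_c + 1.0 / tau_e
        return (4.0 * delta_i**2 / (tau_c + tau_e)) / (inv_tau**2 + (2.0 * np.pi * f)**2)

    # Illustrative evaluation over 1 Hz .. 100 kHz (hypothetical values)
    f = np.logspace(0, 5, 200)
    S = rts_psd(f, tau_c=1e-3, tau_e=2e-3, delta_i=50e-9)

The spectrum is flat below the corner frequency 1/(2π) · (1/τc + 1/τe) and rolls off as 1/f² above it, which is why the three parameters τc, τe and ΔI fully determine the RTS contribution to the drain-current noise.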
