Abstract

Drain read disturb (DRD) is becoming an intrinsic reliability concern for NOR flash scaling and multilevel-cell operation. Reducing the read drain voltage mitigates this concern, but that reduction in turn affects random telegraph signal (RTS) noise and measured charge detrapping in cycled cells. A DRD time-to-error model has been developed that accounts for voltage dependence, read cycling, and Poisson random statistics. The model enables wafer-level tests that quantify the read-window-budget tradeoff between drain voltage and the DRD, RTS, and charge-detrapping effects. Models for the effect of read drain voltage on RTS and charge detrapping are also presented; they show that RTS increases as drain voltage is reduced, owing to the mobility effect of the increased gate field. Together, these models allow the read drain voltage to be optimized for a given product usage model and support wafer-level assessment of process improvements, ensuring that products meet reliability requirements.
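
To make the statistical structure concrete, the sketch below shows one way a Poisson-based time-to-error estimate of this kind can be set up: each read is assumed to disturb a cell independently at a per-read rate that accelerates with drain voltage, so error counts over many reads follow Poisson statistics. The exponential voltage-acceleration form and all parameter values (`lam0`, `beta`, `v_ref`, the read rate) are illustrative assumptions for this sketch, not the paper's fitted model.

```python
import math

def time_to_error(v_drain, read_rate_hz, fail_prob=0.01,
                  lam0=1e-12, beta=8.0, v_ref=1.0):
    """Estimate the time until a DRD error occurs with probability
    `fail_prob`, under two illustrative assumptions:
      - each read disturbs the cell independently at a rate
        lam(V) = lam0 * exp(beta * (v_drain - v_ref))  [errors/read],
      - error counts over n reads are Poisson(n * lam(V)), so
        P(>= 1 error in n reads) = 1 - exp(-n * lam(V)).
    All parameter values are placeholders, not fitted data.
    """
    lam = lam0 * math.exp(beta * (v_drain - v_ref))  # per-read error rate
    # Solve 1 - exp(-n * lam) = fail_prob for the read count n.
    n_reads = -math.log(1.0 - fail_prob) / lam
    return n_reads / read_rate_hz  # seconds at the given read rate

# Lowering the read drain voltage extends the time to error,
# illustrating one side of the tradeoff discussed in the abstract.
for v in (0.8, 1.0, 1.2):
    print(f"Vd = {v:.1f} V -> t_err ~ {time_to_error(v, 1e3):.3e} s")
```

Under these assumptions, sweeping the drain voltage directly exposes the tradeoff the abstract describes: the Poisson rate falls exponentially with voltage, lengthening the time to error, while the RTS and charge-detrapping penalties of a lower drain voltage would have to be weighed separately in the read-window budget.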
