Abstract

We present a new simulation method for predicting the data retention time distribution under random dopant fluctuation (RDF). Predicting the retention time distribution over a wide range, including the tail, normally requires very long simulation times; the proposed model is extremely fast compared with the Monte Carlo method and machine learning methods. We apply the model to a dynamic random access memory (DRAM) cell transistor at the 20 nm technology node. The simulation results show that RDF degrades the data retention time, particularly in tail cells, and that this degradation can be effectively mitigated by reducing the interface trap density and the drain doping density.
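As context for the comparison above, a Monte Carlo approach samples per-cell dopant-induced variation and propagates it to a retention time, which is why covering the far tail is slow. The toy sketch below is purely illustrative and is not the authors' model: it assumes RDF can be summarized as a Gaussian threshold-voltage shift that modulates junction leakage exponentially, and all parameter values are invented for the example.

```python
import math
import random

# Illustrative-only Monte Carlo sketch (NOT the paper's method):
# RDF is assumed to act as a Gaussian threshold-voltage (Vt) shift per
# cell; leakage is assumed to grow exponentially as Vt drops, so the
# retention time shrinks accordingly. All numbers are assumptions.

random.seed(0)

def sample_retention_times(n_cells, sigma_vt=0.03, t_nominal=100e-3):
    """Return simulated retention times (s) for n_cells with RDF spread."""
    k_t = 0.026  # thermal voltage at room temperature (V)
    times = []
    for _ in range(n_cells):
        dvt = random.gauss(0.0, sigma_vt)     # RDF-induced Vt shift (V)
        leak_factor = math.exp(-dvt / k_t)    # leakage rises as Vt drops
        times.append(t_nominal / leak_factor) # retention ~ 1 / leakage
    return times

times = sorted(sample_retention_times(10_000))
median = times[len(times) // 2]
tail = times[int(0.001 * len(times))]  # 0.1% worst (tail) cell
print(f"median retention: {median * 1e3:.1f} ms")
print(f"0.1% tail retention: {tail * 1e3:.1f} ms")
```

The sketch makes the tail problem concrete: resolving the 0.1% (or rarer) tail cells requires many thousands of samples, which is the cost the proposed fast model avoids.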
