Abstract
The variability in trap-assisted tunneling leakage, enhanced by random discrete dopants (RDD), causes refresh failures in scaled 6F² dynamic random-access memory (DRAM) cells. Worst-case leakage analysis is therefore in high demand, but it incurs significant computational cost. To overcome this issue, we performed 200 leakage-variability simulations with RDD and a single trap to train a multi-layer neural-network (NN) model. We then propose a simulation flow that uses the NN model to find the worst RDD configuration among 5,000 candidates. We demonstrate that the worst-case leakage can be found with 96.7% probability at only 5.5% of the computational cost of a full 3D TCAD statistical simulation approach.
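The flow described above is a surrogate-model screening loop: a small NN is trained on a limited set of expensive TCAD leakage simulations and then used to rank a much larger pool of candidate RDD configurations, so that only the top-ranked ones need full 3D verification. Below is a minimal, hypothetical sketch of such a loop; the feature encoding of an RDD configuration, the network size, the synthetic data, and the "re-simulate the worst ~5%" cut-off are illustrative assumptions and not taken from the paper.

```python
# Hypothetical sketch of an NN-surrogate screening flow (not the authors' code).
# The RDD feature encoding, network architecture, and data are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder: each RDD configuration encoded as a fixed-length feature vector
# (e.g., dopant counts/positions binned around the trap). 200 TCAD-simulated
# training samples with their trap-assisted-tunneling leakage currents.
n_train, n_features = 200, 16
X_train = rng.normal(size=(n_train, n_features))
y_train = rng.lognormal(mean=-30, sigma=1.0, size=n_train)  # synthetic leakage [A]

scaler = StandardScaler().fit(X_train)
# Regress log-leakage so the wide dynamic range is easier for the NN to fit.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_train), np.log10(y_train))

# Screen 5,000 candidate RDD configurations with the cheap surrogate and keep
# only the highest-predicted-leakage ones for full 3D TCAD verification.
X_cand = rng.normal(size=(5000, n_features))
pred_log_leak = model.predict(scaler.transform(X_cand))
top_k = np.argsort(pred_log_leak)[::-1][:250]  # e.g., re-simulate worst ~5% in TCAD
print("Candidate indices to verify with full TCAD:", top_k[:10])
```

In this kind of flow, the reported cost saving comes from replacing most of the 5,000 full 3D simulations with cheap NN predictions, at the price of a small probability of missing the true worst case.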