Abstract

Process variations (PV), including global variation (GV) and local variation (LV), have become a major issue in advanced technologies and are critical to circuit performance and yield. However, developing a mature, physics-based compact model is challenging and time-consuming. In this work, we therefore propose a machine learning (ML) based method for device modeling with PV and implement the corresponding circuit simulation, demonstrated on advanced nanosheet FETs (NSFETs). Verified against TCAD simulations, the artificial neural network (ANN)-based ML algorithm captures PV, e.g., dimension and work function variation (WFV), with high accuracy and improved efficiency. For GV, simulations of a ring oscillator (RO) built from ANN-surrogate NSFETs show that a larger nanosheet width (Wsh) or height (Hsh) leads to a higher RO frequency and lower circuit delay. For LV, the respective impacts of grain size and work function on circuit performance can be distinguished. The proposed workflow, from ANN model training to circuit simulation based on the generated Verilog-A model, is fully automatic and promises to shorten the device modeling procedure and accelerate the development of advanced technologies.
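To make the ANN-surrogate idea concrete, the sketch below shows one plausible way to train such a model: a small feed-forward network mapping terminal voltages plus PV parameters (Wsh, Hsh, work function) to drain current. This is not the authors' code; the data file, feature ordering, layer sizes, and log-current target are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation): an ANN
# surrogate mapping [Vgs, Vds, Wsh, Hsh, WF] -> Ids for an NSFET.
import numpy as np
import torch
import torch.nn as nn

# Hypothetical TCAD samples: columns = [Vgs, Vds, Wsh, Hsh, WF, Ids].
data = np.loadtxt("tcad_nsfet_samples.csv", delimiter=",")  # assumed file
x = torch.tensor(data[:, :5], dtype=torch.float32)
# Fit log(Ids) so the surrogate stays accurate across sub-threshold
# and on-state currents, which span many orders of magnitude.
y = torch.log(torch.tensor(data[:, 5:], dtype=torch.float32).clamp_min(1e-15))

model = nn.Sequential(
    nn.Linear(5, 32), nn.Tanh(),   # tanh keeps the modeled I-V surface
    nn.Linear(32, 32), nn.Tanh(),  # smooth and differentiable, which
    nn.Linear(32, 1),              # circuit solvers rely on for convergence
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```

In the workflow described above, a trained network of this kind would then be exported as a Verilog-A model (weights embedded as the current equation) for RO simulation; that export step is not shown here.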
