Abstract

Nanowire field-effect transistors (NWFETs) are considered the next-generation device architecture for sub-10 nm technology nodes, succeeding FinFETs. However, the highly confined geometry of NWFETs gives rise to reliability issues that significantly degrade performance. This work therefore proposes a machine learning-based technique for analyzing self-heating-induced reliability issues in NWFETs. The influence of self-heating is predicted via multivariable regression in terms of saturation current (Idsat), threshold voltage (Vth), maximum carrier temperature along the channel (eTmax), and maximum lattice temperature (LTmax). TCAD-assisted machine learning is used for training and prediction: a dataset is created by varying NWFET parameters such as channel thickness (tsi), oxide thickness (tox), source/drain length (Lsd), source/drain contact length (Lsdc), and doping concentrations. A Random Forest Regression algorithm is then applied to this dataset to predict the desired output parameters.
