Nanowire Field Effect Transistors (NWFETs) are considered the next-generation device architecture for sub-10 nm technology nodes, succeeding FinFETs. However, the highly confined geometry of NWFETs gives rise to reliability issues that significantly degrade device performance. This work therefore proposes a machine learning-based technique for analyzing self-heating-induced reliability issues in NWFETs. The influence of self-heating is predicted via multivariable regression in terms of the saturation current (Idsat), the threshold voltage (Vth), the maximum carrier temperature along the channel (eTmax), and the maximum lattice temperature (LTmax). TCAD-assisted machine learning is used for training and prediction: a dataset is generated by varying NWFET device parameters such as the channel thickness (tsi), the oxide thickness (tox), the source/drain length (Lsd), the source/drain contact length (Lsdc), and the doping concentrations. A Random Forest Regression algorithm trained on this dataset is shown to predict the desired output parameters with suitable accuracy.
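As a minimal sketch of the multivariable regression workflow the abstract outlines, the example below fits a single Random Forest model that maps the swept device parameters to all four reliability metrics at once. The library (scikit-learn), the CSV file name (nwfet_tcad.csv), the doping column names (Nsd, Nch), and the hyperparameters are assumptions for illustration, not details taken from the paper.

```python
# Hedged sketch of the TCAD-assisted Random Forest workflow described above.
# Assumed: scikit-learn as the ML library, a CSV export of the TCAD dataset
# named "nwfet_tcad.csv", and the exact feature/target column names below.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Device parameters swept in TCAD (features) and simulated outputs (targets).
features = ["tsi", "tox", "Lsd", "Lsdc", "Nsd", "Nch"]  # geometry and doping
targets = ["Idsat", "Vth", "eTmax", "LTmax"]            # self-heating metrics

data = pd.read_csv("nwfet_tcad.csv")
X_train, X_test, y_train, y_test = train_test_split(
    data[features], data[targets], test_size=0.2, random_state=0
)

# RandomForestRegressor handles multi-output targets natively, so one model
# predicts all four reliability metrics simultaneously.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Per-target R^2 on held-out simulations indicates how well each metric
# can be predicted from the device parameters alone.
scores = r2_score(y_test, model.predict(X_test), multioutput="raw_values")
for name, score in zip(targets, scores):
    print(f"{name}: R^2 = {score:.3f}")
```

Predicting all targets with one forest is a convenient default; per-target forests with individually tuned hyperparameters are an equally plausible reading of the abstract.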