Abstract

This article presents a novel machine learning-based framework for modeling the drain current of InGaAs FinFETs. An artificial neural network, trained on a comprehensive dataset generated by 3D numerical device simulations, forms the core of the model. The simulations cover a wide range of device geometries and bias conditions, yielding diverse current-voltage characteristics. The neural network was built and trained in a Python environment using TensorFlow to predict the drain current, and its performance was evaluated on independent test datasets, demonstrating its effectiveness on unseen data. The model accurately captures the effects of varying gate length, drain voltage, and gate voltage, and its accuracy was assessed with multiple error metrics: mean squared error, mean absolute error, root mean squared error, and correlation coefficient. At shorter gate lengths, the calculated relative error in drain current was 2.18% at VDS = 50 mV and 1.17% at VDS = 1 V.
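As a minimal sketch of the kind of model the abstract describes, the snippet below trains a small Keras MLP mapping (gate length, VGS, VDS) to drain current and reports MSE, MAE, and RMSE. The layer sizes, tanh activations, input normalization, log-scaling of the current, and the toy placeholder data are illustrative assumptions, not the authors' exact architecture or dataset.

```python
import numpy as np
import tensorflow as tf

# Placeholder data standing in for the TCAD-generated dataset (assumption):
# inputs = [gate length (nm), VGS (V), VDS (V)], target = drain current (A).
rng = np.random.default_rng(0)
X = rng.uniform([10.0, 0.0, 0.05], [30.0, 1.0, 1.0], size=(1000, 3)).astype("float32")
i_d = (1e-6 * X[:, 1] ** 2 * X[:, 2] / X[:, 0]).astype("float32")  # toy surrogate, not device physics
y = np.log10(i_d + 1e-12)   # fit log10(ID) so small currents are not swamped (assumption)

norm = tf.keras.layers.Normalization()   # input feature scaling
norm.adapt(X)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),
    norm,
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(1),            # predicted log10(ID)
])

model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.MeanAbsoluteError(),
                       tf.keras.metrics.RootMeanSquaredError()])
model.fit(X, y, epochs=50, batch_size=64, validation_split=0.2, verbose=0)

# Keras reports MSE (the loss), MAE, and RMSE; a correlation coefficient can be
# obtained separately via np.corrcoef(model.predict(X).ravel(), y).
print(model.evaluate(X, y, verbose=0))
```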
After training, the network parameters were extracted to create a Verilog-A model, which was integrated into a SPICE simulator for circuit simulation. A resistive-load inverter circuit was designed to demonstrate the model's capabilities, and its voltage transfer characteristic and transient response were analyzed within SPICE. These results validate the effectiveness of the developed machine learning-based InGaAs FinFET model.
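The abstract describes the Verilog-A export only at a high level. One plausible realization of the parameter-extraction step (not necessarily the authors' flow) is to dump the trained weights and biases as Verilog-A `parameter real` declarations and re-implement the normalization and tanh feedforward inside the module's analog block; the naming scheme below and the reference to `model` from the previous sketch are assumptions.

```python
import tensorflow as tf

def dump_verilog_a_params(model):
    """Print each Dense layer's weights and biases as Verilog-A parameter lines,
    e.g. `parameter real w1_0_2 = -1.234560e-01;` (naming is an assumption)."""
    dense_layers = [l for l in model.layers if isinstance(l, tf.keras.layers.Dense)]
    for i, layer in enumerate(dense_layers):
        W, b = layer.get_weights()           # W: (n_in, n_out), b: (n_out,)
        for r in range(W.shape[0]):
            for c in range(W.shape[1]):
                print(f"parameter real w{i}_{r}_{c} = {W[r, c]:.6e};")
        for c in range(b.shape[0]):
            print(f"parameter real b{i}_{c} = {b[c]:.6e};")

# dump_verilog_a_params(model)   # `model` is the trained network from the sketch above
```

In such a flow, the resulting Verilog-A module would evaluate the same feedforward network on the instance's gate length and terminal voltages and contribute the predicted drain current to the SPICE netlist, for example as the pull-down device in the resistive-load inverter used for the voltage transfer and transient analyses.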