Abstract

This paper demonstrates a novel application of a deep learning (DL) model designed by a genetic algorithm (GA) for modelling the figures of merit (FOMs) of gallium nitride (GaN) based high electron mobility transistors (HEMTs). To the best of our knowledge, this is the first paper to present an end-to-end methodology that automates the optimization of a DL model specifically for semiconductor devices. Such design automation substantially lowers the barrier to adopting DL techniques and encourages researchers in the semiconductor field to exploit their potential. To optimize and train the DL model, a dataset of 2160 unique GaN HEMTs covering various epitaxial structures has been generated using an experimentally validated simulation methodology. Four FOMs, i.e., on-current (Ion), maximum transconductance (Gm), subthreshold slope (SS), and threshold voltage (Vth), have been extracted to generate the dataset. Any three of the FOMs can be used as inputs to train the DL model to predict the fourth; in this particular application, Ion, Gm, and SS are used to predict the Vth of the GaN HEMT. The GA converges after 15 generations for a population size of 250. The optimized DL model obtained from the GA has a batch size of 8, the Adam optimizer, four hidden layers with 16, 8, 16, and 16 neurons, respectively, the ReLU activation function, and a zero dropout rate. The optimized DL model provides accurate predictions of the FOMs, even for unseen inputs, at very low computational cost and in real time (within a second). Its performance has also been compared with Lasso, Ridge, and support vector regression (SVR) models: the mean square error and R-squared values are 0.05 and 0.978 for the optimized DL model, 1 and 0.59 for Lasso, 0.29 and 0.88 for Ridge, and 0.17 and 0.931 for SVR, respectively.
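
For illustration, the following is a minimal sketch of the GA-optimized network described above, written in Keras (an assumption; the abstract does not name the framework used in the paper). The layer widths, ReLU activation, zero dropout, Adam optimizer, and batch size follow the abstract; the function name build_optimized_model, the epoch count, and the validation split are hypothetical placeholders.

    from tensorflow import keras
    from tensorflow.keras import layers

    def build_optimized_model():
        """GA-optimized architecture from the abstract: four hidden layers
        (16, 8, 16, 16 neurons), ReLU activation, zero dropout (so no
        Dropout layers), Adam optimizer. Inputs: Ion, Gm, SS; output: Vth."""
        model = keras.Sequential([
            layers.Input(shape=(3,)),           # three FOMs as inputs: Ion, Gm, SS
            layers.Dense(16, activation="relu"),
            layers.Dense(8, activation="relu"),
            layers.Dense(16, activation="relu"),
            layers.Dense(16, activation="relu"),
            layers.Dense(1),                    # predicted Vth (regression output)
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    # Hypothetical usage: X_train has shape (n_samples, 3), y_train (n_samples,).
    # Batch size 8 matches the GA solution; epochs and split are assumptions.
    # model = build_optimized_model()
    # model.fit(X_train, y_train, batch_size=8, epochs=100, validation_split=0.1)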
