Abstract

In this study, we propose a deep neural network (DNN) model that extracts the subgap states in the channel layer of oxide thin-film transistors. We have developed a framework that includes creating a model training set, preprocessing the data, optimizing the model structure, decoding from density-of-states (DOS) parameters to current–voltage (I–V) characteristics, and evaluating the model performance and curve-fitting accuracy. We investigate in detail the effect of data preprocessing methods and model structure on the performance of the model. The primary finding is that the input data type and the last hidden layer significantly affect the performance of the regression model. Using double-type input data composed of several voltages and linear current values is more advantageous than using log-scale current values. Moreover, the number of nodes in the last hidden layer of a regression model with multiple output nodes should be large enough to avoid interference between the output values. The proposed model outputs five DOS parameters, and the resulting parameters are decoded to an I–V curve through interpolation based on the nearest 32 data points in the given dataset. We evaluate the model performance using the threshold voltage and on-current difference between a target curve and the decoded curve. The proposed model calibrates 97.1% of the 14,400 curves within a threshold voltage difference of 0.2 V and an on-current error of 5%. Hence, the proposed model is verified to effectively extract DOS parameters with high accuracy based on the current characteristics of oxide thin-film transistors. We expect to improve the efficiency of defect analysis by replacing the iterative manual technology computer-aided design (TCAD) curve fitting with an automatic DNN model.
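To make the described pipeline concrete, the sketch below illustrates the kind of regression model, nearest-neighbor decoding, and pass/fail check the abstract outlines. It is only a minimal illustration under stated assumptions: the layer widths, the number of sampled (V, I) points, the names `DosRegressor`, `decode_iv`, and `fit_is_acceptable`, and the inverse-distance weighting in the interpolation are placeholders for this example, not the architecture or decoding scheme reported in the paper.

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only (not the values used in the paper).
N_SAMPLES = 64      # number of (gate voltage, linear drain current) pairs per curve
N_DOS_PARAMS = 5    # five density-of-states parameters predicted per curve


class DosRegressor(nn.Module):
    """Sketch of a DNN regressor mapping sampled I-V data to five DOS parameters."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            # Input: voltages and linear (not log-scale) currents, concatenated.
            nn.Linear(2 * N_SAMPLES, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            # A wide last hidden layer so the five output values do not interfere.
            nn.Linear(256, 512),
            nn.ReLU(),
            nn.Linear(512, N_DOS_PARAMS),
        )

    def forward(self, x):
        return self.net(x)


def decode_iv(dos_pred, dataset_params, dataset_curves, k=32):
    """One plausible reading of the decoding step: interpolate an I-V curve from
    the k = 32 nearest entries (in DOS-parameter space) of the simulated dataset,
    weighted by inverse distance. The paper's exact interpolation may differ."""
    d = torch.norm(dataset_params - dos_pred, dim=1)          # distance to every dataset entry
    dist, idx = torch.topk(d, k, largest=False)               # 32 nearest neighbours
    w = 1.0 / (dist + 1e-12)
    w = w / w.sum()
    return (w.unsqueeze(1) * dataset_curves[idx]).sum(dim=0)  # weighted-average curve


def fit_is_acceptable(vth_target, vth_decoded, ion_target, ion_decoded):
    """Pass/fail criterion from the abstract: |dVth| <= 0.2 V and
    on-current error <= 5% between the target and decoded I-V curves."""
    dvth_ok = abs(vth_decoded - vth_target) <= 0.2
    ion_ok = abs(ion_decoded - ion_target) / abs(ion_target) <= 0.05
    return dvth_ok and ion_ok


if __name__ == "__main__":
    model = DosRegressor()
    iv_batch = torch.rand(8, 2 * N_SAMPLES)   # 8 synthetic I-V input vectors
    dos_params = model(iv_batch)              # -> (8, 5) predicted DOS parameters
    print(dos_params.shape)
```

In this sketch the wide 512-node layer before the output reflects the abstract's observation that the last hidden layer must be large enough to keep the five regression outputs from interfering with one another.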
