Abstract

The work addresses the task of creating mathematical software for constructing quantitative dependency models based on feedforward neural networks. A modification of the dropout method is proposed that better prevents the model from overfitting. The modified method takes into account the effect of each neuron on the model error: the dropout probability is increased for neurons that affect the model error more and decreased for neurons that affect it less. The new dropout probabilities depend not on the degree of influence on the error itself, but on the number of neurons in the same layer that affect the error more or less. The dropout probability of the neuron with the smallest influence on the error is decreased by 50 % of the base probability, and that of the neuron with the largest influence is increased by 50 %. To calculate the dropout probabilities of all neurons, it is proposed to use a sigmoid function with a nonlinearity coefficient. The mean dropout probability remains unchanged, so the modification affects only the learning process. Although training neural networks with the proposed method takes more time, the quality of the trained models increases. The practical problem of determining the critical pitting temperatures of AISI 321 steel from its characteristics has been solved. Neural network models were constructed, trained, and tested on data on the characteristics of the steel. The constructed models differ in the number of neurons in the hidden layer and in the base dropout probability. Each model was trained by three methods: without dropout, with the usual dropout method, and with the modified dropout method. The test results of all constructed models have been compared.
The average error on the test data when using the modified dropout method is about 9 % less than when using the usual method.
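The abstract does not give the exact formula for the rank-to-probability mapping, but its stated constraints (probabilities depend only on each neuron's rank of influence within a layer, a sigmoid with a nonlinearity coefficient, the least influential neuron at 0.5 of the base probability, the most influential at 1.5, and the layer mean unchanged) admit a straightforward reading. The sketch below is one plausible interpretation under those assumptions; the function name `rank_based_dropout_probs`, the coefficient `k`, and the per-neuron `influence` scores are illustrative names, not the authors' notation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rank_based_dropout_probs(influence, p_base, k=4.0):
    """Map per-neuron influence scores (one hidden layer) to dropout
    probabilities that depend only on each neuron's rank within the
    layer, not on the magnitude of its influence.

    Assumes len(influence) >= 2.  k is the sigmoid nonlinearity
    coefficient: larger k pushes mid-ranked neurons toward p_base.
    """
    n = len(influence)
    # Double argsort yields ranks: 0 = least influential, n-1 = most.
    ranks = np.argsort(np.argsort(influence)).astype(float)
    t = 2.0 * ranks / (n - 1) - 1.0          # ranks rescaled to [-1, 1]
    # Sigmoid with coefficient k, renormalized to span exactly [0, 1].
    s = (sigmoid(k * t) - sigmoid(-k)) / (sigmoid(k) - sigmoid(-k))
    # Least influential neuron: 0.5 * p_base; most influential: 1.5 * p_base.
    # The rescaled ranks are symmetric about 0, so the mean stays p_base.
    return p_base * (0.5 + s)
```

Because the mean probability over the layer equals the base probability, the usual dropout scaling at inference time is unaffected, which matches the abstract's remark that the modification concerns only the learning process.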
