Studies show that multilayer feedforward neural networks have become increasingly widespread, owing to the simplicity of their implementation, the availability of advanced training methods, and the possibility of parallel computation. In these networks, computational complexity grows proportionally to the square of the data size, which reduces performance and increases hardware costs. This work investigates the main problems of synthesizing artificial neural networks, criteria for evaluating their effectiveness, and the choice of the number of layers and of neurons in each layer. The mean square error, mean absolute error, and mean absolute percentage error criteria are considered for evaluating the efficiency of neural networks. In general, one hidden layer is sufficient for most problems, two hidden layers can represent a function of any shape, and adding a third or further layers yields very little improvement in network performance. Expressions are given for the total number of parameters depending on the number of neurons and on the number of inputs from the binary sequence, as well as for the minimum and maximum numbers of training examples depending on the total number of parameters and the number of outputs. The studies show that the choice of criteria for evaluating the efficiency of a statistical model depends on the problem being solved. The results of simulations performed with the MATLAB package are presented.
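The three evaluation criteria named in the abstract have standard definitions; as a minimal sketch (in Python with NumPy rather than the MATLAB package used in the work, and with illustrative data of our own choosing), they can be computed as:

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean square error: average of the squared residuals.
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    # Mean absolute error: average of the absolute residuals.
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred):
    # Mean absolute percentage error, in percent;
    # assumes y_true contains no zeros.
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Illustrative targets and predictions (not from the paper).
y_true = np.array([2.0, 4.0, 5.0])
y_pred = np.array([2.5, 3.5, 5.0])

print(mse(y_true, y_pred))   # ≈ 0.1667
print(mae(y_true, y_pred))   # ≈ 0.3333
print(mape(y_true, y_pred))  # 12.5 (percent)
```

MSE penalizes large residuals quadratically, MAE weights all residuals equally, and MAPE expresses error relative to the target magnitude, which is one reason the appropriate criterion depends on the problem being solved.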