Abstract

The training algorithm is the main driver in artificial neural networks. Its performance is influenced by several parameters, including the number of neurons in the input and hidden layers, the maximum number of epochs, the activation function, and the size of the learning rate (lr). One benchmark for optimizing training-algorithm performance is the error, or MSE (mean squared error), produced: the smaller the error, the more optimal the performance. Tests conducted in a previous study indicated that the most optimal training algorithm, based on the smallest MSE produced, was Levenberg-Marquardt (LM), with an average MSE of 0.001 at the level of α=5% using 10 neurons in the hidden layer. This study therefore aims to test the LM algorithm using several variations in the number of hidden-layer neurons. The LM algorithm was tested with 5 neurons in the input layer; 2, 4, 5, 7, or 9 neurons in the hidden layer; and the same parameters as the previous study. This study uses a mixed method: developing computer programs and quantitatively testing their output data with statistical tests. The results showed that the LM algorithm performed most optimally with 9 neurons in the hidden layer at lr=0.5, with the smallest error of 0.000137501±0.000178355.
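The experiment described above can be illustrated with a minimal sketch of Levenberg-Marquardt training for a single-hidden-layer network. This is not the authors' implementation: the toy regression task (fitting sin(x)), the hidden-layer size, the finite-difference Jacobian, and the damping schedule are all illustrative assumptions; the LM update itself is the standard damped Gauss-Newton step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy regression task: learn y = sin(x) on [-2, 2].
X = np.linspace(-2, 2, 40).reshape(-1, 1)
y = np.sin(X).ravel()

N_HIDDEN = 9  # hidden-layer size under test (the study varies 2, 4, 5, 7, 9)

def unpack(theta):
    """Split the flat parameter vector into layer weights and biases."""
    h = N_HIDDEN
    W1 = theta[:h].reshape(h, 1)
    b1 = theta[h:2 * h]
    W2 = theta[2 * h:3 * h]
    b2 = theta[3 * h]
    return W1, b1, W2, b2

def predict(theta, X):
    """One hidden tanh layer followed by a linear output."""
    W1, b1, W2, b2 = unpack(theta)
    return np.tanh(X @ W1.T + b1) @ W2 + b2

def residuals(theta):
    return predict(theta, X) - y

def jacobian(theta, eps=1e-6):
    """Finite-difference Jacobian of the residual vector (for clarity,
    not speed; a real implementation would use backpropagation)."""
    r0 = residuals(theta)
    J = np.empty((r0.size, theta.size))
    for j in range(theta.size):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (residuals(t) - r0) / eps
    return J

def train_lm(theta, lam=1e-2, iters=100):
    """Levenberg-Marquardt: damped Gauss-Newton with adaptive lambda."""
    for _ in range(iters):
        r = residuals(theta)
        J = jacobian(theta)
        A = J.T @ J + lam * np.eye(theta.size)
        step = np.linalg.solve(A, J.T @ r)
        trial = theta - step
        if np.mean(residuals(trial) ** 2) < np.mean(r ** 2):
            theta, lam = trial, lam * 0.5   # step helped: trust the model more
        else:
            lam *= 2.0                      # step hurt: increase damping
    return theta

theta0 = rng.normal(scale=0.5, size=3 * N_HIDDEN + 1)
mse0 = np.mean(residuals(theta0) ** 2)
theta = train_lm(theta0)
mse = np.mean(residuals(theta) ** 2)
print(f"MSE before: {mse0:.6f}  after: {mse:.6f}")
```

Repeating the run for each hidden-layer size and comparing the resulting MSE values mirrors the comparison the study performs; the damping factor lambda interpolates between gradient descent (large lambda) and Gauss-Newton (small lambda).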
