Abstract
This article presents the results of a meta-analysis of combinations of learning rate, momentum, and number of hidden-layer neurons in the artificial neural network backpropagation (ANN-BP) architecture. The study aims to identify the most recommended values within the learning rate and momentum interval, namely [0,1], as well as the number of hidden-layer neurons to use during the data training process. We conducted a meta-analysis of the learning rate, momentum, and number of hidden-layer neurons used in ANN-BP. Of the 63 data sets meeting the eligibility criteria, 44 reported complete learning-rate data, 30 reported complete momentum data, and 45 reported complete data on the number of hidden-layer neurons. The analysis showed that the recommended learning rate lies in the interval 0.1-0.2 with a random-effects (RE) model value of 0.938 (very high), the recommended momentum lies in the interval 0.7-0.9 with an RE model value of 0.925 (very high), and the number of neurons in the input layer should be smaller than the number of neurons in the hidden layer, with an RE model value of 0.932 (very high). These recommendations were obtained by analyzing, in JASP, the effect sizes of the accuracy levels reported in the sample studies.
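As a hypothetical illustration (not drawn from the paper's experiments), the sketch below shows how the recommended hyperparameters could be applied to a single-hidden-layer backpropagation network trained with gradient descent plus momentum: a learning rate in [0.1, 0.2], momentum in [0.7, 0.9], and more hidden neurons than input neurons. All data, layer sizes, and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Recommended settings from the meta-analysis (values within the intervals):
n_input, n_hidden, n_output = 4, 8, 1   # hidden layer larger than input layer
lr, momentum = 0.1, 0.9                 # learning rate in [0.1, 0.2], momentum in [0.7, 0.9]

# Toy, hypothetical training data: label is whether the feature sum is positive.
X = rng.normal(size=(32, n_input))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

W1 = rng.normal(scale=0.5, size=(n_input, n_hidden))
W2 = rng.normal(scale=0.5, size=(n_hidden, n_output))
v1 = np.zeros_like(W1)  # momentum (velocity) buffers
v2 = np.zeros_like(W2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Backward pass (mean squared error, sigmoid derivative s*(1-s))
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Momentum update: v <- momentum*v - lr*grad; W <- W + v
    v2 = momentum * v2 - lr * (h.T @ d_out) / len(X)
    v1 = momentum * v1 - lr * (X.T @ d_h) / len(X)
    W2 += v2
    W1 += v1

accuracy = float(((out > 0.5) == (y > 0.5)).mean())
```

The momentum term accumulates past gradients, which is why relatively high values (0.7-0.9) pair well with a modest learning rate: the effective step size grows roughly by a factor of 1/(1 - momentum) along consistent gradient directions.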