Abstract
This research studied a backpropagation artificial neural network (ANN) implemented with the Matlab GUI. The data used are sunshine duration data. The network architecture was formed by specifying the number of units in each layer; after the network was formed, training and testing were carried out on the previously grouped data. The prediction stage then used the trainrp training method with the logsig activation function in each layer. The network has 120 input-layer neurons, 10 neurons in the first hidden layer, 10 neurons in the second hidden layer, and 1 output neuron, while the training parameters are a maximum of 1000 epochs, a goal of 0.001, a learning rate of 0.7, and a step of 1. Based on the simulation, the results were 51.25% for January, 62.00% for February, 59.29% for March, 64.52% for April, 71.42% for May, 79.32% for June, 64.25% for July, 77.87% for August, 85.02% for September, 81.33% for October, 56.67% for November, and 39.14% for December. The simulation results were obtained with an MSE of 5.114, a MAD of 1.479, a MAPE of 2.162, an RMSE of 2.261, and an accuracy of 97.43%.
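The error measures reported above (MSE, MAD, MAPE, RMSE) have standard definitions. As a minimal sketch, assuming `actual` and `predicted` are arrays of observed and forecast sunshine-duration values (the names and the Python/NumPy setting are illustrative, not from the paper), they can be computed as:

```python
import numpy as np

def forecast_errors(actual, predicted):
    """Compute the error measures named in the abstract:
    MSE, MAD, MAPE (as a percentage), and RMSE."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = actual - predicted
    mse = np.mean(err ** 2)                      # mean squared error
    mad = np.mean(np.abs(err))                   # mean absolute deviation
    mape = np.mean(np.abs(err / actual)) * 100   # mean absolute percentage error
    rmse = np.sqrt(mse)                          # root mean squared error
    return {"MSE": mse, "MAD": mad, "MAPE": mape, "RMSE": rmse}

# Example with toy values (not the paper's data):
# forecast_errors([100, 80], [90, 84]) gives
# MSE = 58.0, MAD = 7.0, MAPE = 7.5, RMSE = sqrt(58)
```

Note that MAPE requires all actual values to be nonzero; how the paper derives its 97.43% accuracy figure from these measures is not stated in the abstract.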
More From: IJECA (International Journal of Education and Curriculum Application)