Abstract
This study implements a recurrent neural network (RNN) for foreign exchange forecasting and compares two RNN architectures, Elman and Jordan, trained with the backpropagation through time (BPTT) algorithm. The activation functions considered are the linear transfer function, the tan-sigmoid transfer function (tansig), and the log-sigmoid transfer function (logsig), applied to the hidden and output layers. The experiments show that the log-sigmoid transfer function is the most appropriate activation function for the hidden layer, while the linear transfer function is the most appropriate for the output layer. Based on the training and forecasting results for the USD against IDR exchange rate, the Elman BPTT method outperforms the Jordan BPTT method, with the best performance for both reached at the 4000th iteration. The lowest root mean square error (RMSE) values for training and next-day forecasting produced by Elman BPTT were 0.073477 and 122.15, while the Jordan BPTT method yielded 0.130317 and 222.96.
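To make the reported setup concrete, the sketch below shows a minimal Elman-style RNN forward pass with a log-sigmoid hidden layer and a linear output layer, together with the RMSE metric used in the study. It is an illustrative assumption of how such a network can be expressed, not the authors' implementation; all names, dimensions, and the synthetic data are hypothetical.

```python
# Minimal sketch (assumed, not the authors' code): Elman RNN forward pass with
# a log-sigmoid hidden layer, a linear output layer, and the RMSE metric.
import numpy as np

def logsig(x):
    """Log-sigmoid transfer function used for the hidden layer."""
    return 1.0 / (1.0 + np.exp(-x))

def elman_forward(inputs, W_in, W_rec, W_out, b_h, b_o):
    """Run an Elman RNN over a sequence: the hidden state feeds back to itself."""
    h = np.zeros(W_rec.shape[0])
    outputs = []
    for x_t in inputs:
        h = logsig(W_in @ x_t + W_rec @ h + b_h)   # hidden layer: log-sigmoid
        y = W_out @ h + b_o                        # output layer: linear
        outputs.append(y)
    return np.array(outputs)

def rmse(pred, target):
    """Root mean square error, the evaluation metric reported in the abstract."""
    return np.sqrt(np.mean((pred - target) ** 2))

# Illustrative usage on synthetic data (1 input feature, 5 hidden units, 1 output).
rng = np.random.default_rng(0)
seq = rng.normal(size=(20, 1))
W_in = rng.normal(scale=0.1, size=(5, 1))
W_rec = rng.normal(scale=0.1, size=(5, 5))
W_out = rng.normal(scale=0.1, size=(1, 5))
b_h, b_o = np.zeros(5), np.zeros(1)
pred = elman_forward(seq, W_in, W_rec, W_out, b_h, b_o)
print("RMSE vs. a dummy target:", rmse(pred, np.zeros_like(pred)))
```

A Jordan network differs only in the recurrence: the previous output, rather than the previous hidden state, is fed back into the hidden layer. Training the weights with BPTT would unroll this loop over time and backpropagate the error through the unrolled steps.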