Abstract

Recurrent Neural Networks (RNNs) are among the most widely used and effective types of neural network. However, these networks still suffer from weaknesses in learning speed, error convergence, and accuracy caused by long-term dependencies, which manifest mainly as exploding and vanishing gradients in the backpropagation learning algorithm. In this paper, a well-structured Long Short-Term Memory (LSTM) network is used to address these concerns. Four optimizers based on metaheuristic algorithms are chosen to train the LSTM: Harmony Search (HS), Grey Wolf Optimizer (GWO), the Sine Cosine Algorithm (SCA), and Ant Lion Optimization (ALO). The proposed models are applied to the classification and analysis of real medical time-series data sets (the Breast Cancer Wisconsin Data Set and the Epileptic Seizure Recognition Data Set). Classification accuracy, rather than error rate or mean square error, is used as the measure for training the LSTM with the above optimization algorithms. The experimental results are verified using 5-fold cross-validation. Details of the simulations and the code, written in the R programming language, are available at https://github.com/pollaeng/rnn.
