Abstract
Recurrent neural networks (RNNs) have gained considerable attention from researchers working on time series data processing and have proved to be a natural choice for such data. As a result, several studies have analysed time series data and its processing through a variety of RNN techniques. However, every type of RNN has its own limitations. Simple Recurrent Neural Networks (SRNNs) are computationally less complex than other types of RNN, such as the Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). However, SRNNs suffer from the vanishing gradient problem, which makes them difficult to train on long-term dependencies. The vanishing gradient arises during SRNN training because the gradient is repeatedly multiplied by small values when the most traditional optimization algorithm, Gradient Descent (GD), is used. Researchers therefore attempt to overcome this limitation by applying weight optimization techniques such as metaheuristic algorithms. The objective of this paper is to present an extensive review of the challenges and issues of RNN weight optimization techniques and to critically analyse the existing proposed techniques. The authors believe that this review will serve as a primary source of the techniques and methods used to address the problems of RNN-based time series data processing. Furthermore, current challenges and open issues are discussed to identify promising directions for further research.
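For intuition, the vanishing gradient described above can be sketched for a standard Elman-style SRNN with hidden state \(h_t = \tanh(W h_{t-1} + U x_t + b)\) trained by backpropagation through time; this is an illustrative derivation under that assumed recurrence, not a result of the reviewed works:

\[
\frac{\partial \mathcal{L}_T}{\partial h_k}
= \frac{\partial \mathcal{L}_T}{\partial h_T}
\prod_{t=k+1}^{T} \frac{\partial h_t}{\partial h_{t-1}}
= \frac{\partial \mathcal{L}_T}{\partial h_T}
\prod_{t=k+1}^{T} \operatorname{diag}\!\bigl(1 - h_t^{2}\bigr)\, W .
\]

When \(\bigl\lVert \operatorname{diag}(1 - h_t^{2})\, W \bigr\rVert < 1\) for most time steps, this product shrinks roughly exponentially in the gap \(T-k\), so gradient descent receives almost no learning signal from long-range dependencies; gradient-free weight optimizers such as metaheuristics sidestep this multiplicative chain entirely.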