Abstract

Gradient descent learning algorithms for recurrent neural networks (RNNs) perform poorly on long-term dependency problems. In this paper, we propose a novel architecture called the Segmented-Memory Recurrent Neural Network (SMRNN). The SMRNN is trained with an extended real-time recurrent learning algorithm, which is gradient-based. We tested the SMRNN on the standard information latching problem. Our experimental results indicate that gradient descent learning is more effective in the SMRNN than in standard RNNs.
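The segmented-memory idea can be illustrated with a minimal forward-pass sketch: a symbol-level state updates at every input, while a segment-level state updates only at segment boundaries, so information crosses far fewer recurrent nonlinearities over a long sequence. The update equations, the segment length `d`, and the `smrnn_forward` helper below are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def smrnn_forward(inputs, d, params):
    """Hedged sketch of a segmented-memory forward pass.

    x: symbol-level state, updated every step and reset per segment.
    y: segment-level state, updated only every d steps (assumed rule).
    """
    Wxx, Wxu, Wyy, Wyx = params
    nx, ny = Wxx.shape[0], Wyy.shape[0]
    x = np.zeros(nx)
    y = np.zeros(ny)
    ys = []
    for t, u in enumerate(inputs, start=1):
        x = np.tanh(Wxx @ x + Wxu @ u)      # symbol-level update, every step
        if t % d == 0:                      # segment boundary reached
            y = np.tanh(Wyy @ y + Wyx @ x)  # segment-level update
            x = np.zeros(nx)                # start a fresh segment
        ys.append(y.copy())
    return np.array(ys)
```

Because `y` changes only once per segment, a dependency spanning `T` symbols passes through roughly `T/d` segment-level updates instead of `T` recurrent steps, which is the intuition behind the improved gradient flow.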
