Abstract

In this paper, we propose the robust initialization of a Jordan network with recurrent constrained learning (RIJNRCL) algorithm for multilayered recurrent neural networks (RNNs). The algorithm builds on the constrained learning concept for the Jordan network, combined with a recurrent sensitivity and weight convergence analysis, to obtain a tradeoff between training and testing errors. In addition to classical techniques such as an adaptive learning rate and an adaptive dead zone, RIJNRCL employs a recurrent constrained parameter matrix to switch off excessive contributions from hidden layer neurons, based on the weight convergence and stability conditions of multilayered RNNs. It is well known that a good response from the hidden layer neurons and proper weight initialization play a dominant role in avoiding local minima in multilayered RNNs. RIJNRCL addresses these twin problems of weight initialization and hidden layer neuron selection through a novel recurrent sensitivity ratio analysis. We provide detailed steps for applying RIJNRCL to several benchmark time-series prediction problems and show that the proposed algorithm achieves superior generalization performance.
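To make the gating idea concrete, the following is a minimal, illustrative Python sketch of a Jordan network whose hidden units are switched off by a sensitivity-ratio gate. The JordanNetwork class, the ratio definition (each unit's share of the total drive delivered to the output layer), and the threshold value are hypothetical stand-ins for the paper's recurrent constrained parameter matrix and recurrent sensitivity ratio analysis, not the published formulas.

    # Minimal, illustrative sketch (not the paper's exact formulation): a Jordan
    # network whose hidden units are gated by a hypothetical "sensitivity ratio",
    # standing in for RIJNRCL's recurrent constrained parameter matrix.
    import numpy as np

    rng = np.random.default_rng(0)

    class JordanNetwork:
        def __init__(self, n_in, n_hidden, n_out):
            # Small random initial weights; the paper's point is that this
            # initialization step matters and should be done robustly.
            self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))
            self.W_ctx = rng.normal(0.0, 0.1, (n_hidden, n_out))  # context weights
            self.W_out = rng.normal(0.0, 0.1, (n_out, n_hidden))
            self.context = np.zeros(n_out)
            # Gate vector: 1 keeps a hidden unit active, 0 switches it off.
            self.gate = np.ones(n_hidden)

        def step(self, x):
            # Jordan recurrence: the hidden layer sees the previous output.
            h = np.tanh(self.W_in @ x + self.W_ctx @ self.context)
            h = h * self.gate                 # constrained-parameter gating
            y = self.W_out @ h
            self.context = y                  # feed the output back as context
            return h, y

        def update_gate(self, h, threshold=0.02):
            # Hypothetical sensitivity ratio: each hidden unit's share of the
            # total drive it delivers to the output layer. Units whose share
            # falls below the threshold are switched off.
            contrib = np.sum(np.abs(self.W_out) * np.abs(h), axis=0)
            ratio = contrib / (contrib.sum() + 1e-12)
            self.gate = (ratio >= threshold).astype(float)

    # Usage on a toy time series: step through the signal and prune weak units.
    net = JordanNetwork(n_in=1, n_hidden=8, n_out=1)
    series = np.sin(0.1 * np.arange(50))
    for t in range(len(series) - 1):
        h, _ = net.step(np.array([series[t]]))
        net.update_gate(h)
    print("active hidden units:", int(net.gate.sum()))

In this sketch a unit, once gated off, contributes nothing to later steps and therefore stays off, which mirrors the abstract's notion of permanently switching off neurons with excessive (here, negligible) contributions; the actual selection criterion in the paper is derived from its convergence and stability analysis.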
