Abstract
Recurrent neural networks (RNNs) have emerged as a promising tool for modeling nonlinear dynamical systems. Convergence is one of the most important dynamical properties of RNNs in practical applications, because the viability of many RNN applications depends on it. In this paper we study the convergence properties of the weighted state space search algorithm (WSSSA) -- a derivative-free, non-random learning algorithm that searches the neighborhood of the target trajectory in the state space rather than in the parameter space. Because no partial derivatives are computed, the WSSSA is simple, fast, and cost-effective. We provide a necessary and sufficient condition for the convergence of the WSSSA, and offer restrictions that help ensure its convergence to the desired solution. The asymptotic rate of convergence is also analyzed. Our study gives insight into the problem and provides useful information for the practical design of RNNs. A numerical example is given to support the theoretical analysis and to demonstrate its applicability to many applications.
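The abstract does not give the WSSSA's details, but the idea of a derivative-free, non-random neighborhood search over a trajectory can be illustrated with a minimal sketch. The function name, step size, and greedy acceptance rule below are illustrative assumptions, not the paper's actual algorithm:

```python
import random

def neighborhood_search(target, candidate, step=0.1, iters=500, seed=0):
    """Illustrative derivative-free search (a sketch, NOT the paper's WSSSA):
    perturb the candidate trajectory and keep a perturbation only when it
    moves closer to the target trajectory. No gradients are computed."""
    rng = random.Random(seed)

    def dist(a, b):
        # squared Euclidean distance between two trajectories
        return sum((x - y) ** 2 for x, y in zip(a, b))

    best = list(candidate)
    best_err = dist(best, target)
    for _ in range(iters):
        trial = [x + rng.uniform(-step, step) for x in best]
        err = dist(trial, target)
        if err < best_err:  # accept only improvements
            best, best_err = trial, err
    return best, best_err
```

Because acceptance is greedy, the error is non-increasing over iterations; the paper's contribution is a necessary and sufficient condition under which such a search actually converges to the desired solution.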