Abstract

In this paper we derive a condition for robust local stability of multilayer recurrent neural networks with two hidden layers. The stability condition follows from linking linearization theory, robustness analysis of linear systems under nonlinear perturbation, and matrix inequalities. The basin of attraction of the origin is characterized in terms of a level set of a quadratic Lyapunov function. As in NLq theory, local stability is imposed around the origin and the apparent basin of attraction is enlarged by applying the criterion, while the proven basin of attraction remains relatively small due to the conservatism of the criterion. Modifying dynamic backpropagation with the new stability condition is discussed and illustrated with simulation examples.
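The linearization-plus-Lyapunov reasoning in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it assumes a toy two-hidden-layer recurrent network x_{k+1} = W3 tanh(W2 tanh(W1 x_k)) with small illustrative random weights, checks that the linearization at the origin is Schur stable, solves a discrete Lyapunov equation for a quadratic Lyapunov function V(x) = x'Px, and samples a level set of V as a crude estimate of the basin of attraction.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical two-hidden-layer recurrent network (illustrative weights,
# not taken from the paper): x_{k+1} = W3 @ tanh(W2 @ tanh(W1 @ x_k)).
rng = np.random.default_rng(0)
n = 4
W1 = 0.15 * rng.standard_normal((n, n))
W2 = 0.15 * rng.standard_normal((n, n))
W3 = 0.15 * rng.standard_normal((n, n))

# Linearization at the origin: tanh'(0) = 1, so the Jacobian is W3 @ W2 @ W1.
A = W3 @ W2 @ W1
assert max(abs(np.linalg.eigvals(A))) < 1, "linearization must be Schur stable"

# Quadratic Lyapunov function V(x) = x' P x, with P solving A' P A - P = -I.
P = solve_discrete_lyapunov(A.T, np.eye(n))

def step(x):
    return W3 @ np.tanh(W2 @ np.tanh(W1 @ x))

# Crude level-set check: sample points on {x : V(x) = c} and verify that V
# decreases along the nonlinear dynamics, so the level set serves as an
# (estimate of the) basin of attraction of the origin.
c = 0.5
for _ in range(200):
    d = rng.standard_normal(n)
    x = d * np.sqrt(c / (d @ P @ d))   # scale the sample onto the level set
    assert step(x) @ P @ step(x) < x @ P @ x
```

A full analysis would account for the sector-bounded nonlinear perturbation globally via matrix inequalities rather than by sampling; the sketch only illustrates why a Lyapunov level set yields a basin-of-attraction estimate.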
