Abstract

This paper reveals two important characterizations of global exponential stability (GES) for a generic class of continuous-time recurrent neural networks. First, we show that GES of the neural networks can be fully characterized by global asymptotic stability (GAS) of the networks together with the condition that the maximum abscissa of the spectral set of the Jacobian matrix of the networks at the unique equilibrium point is less than zero. This result provides a direct way to distinguish GES from GAS for the neural networks. Second, we show that when the neural networks have small state feedback coefficients, the supremum of the exponential convergence rates (ECRs) of the trajectories of the networks is exactly equal to the absolute value of the maximum abscissa of the spectral set of the Jacobian matrix at the unique equilibrium point. Here, the supremum of the ECRs indicates the potentially fastest speed of trajectory convergence. The obtained results are helpful in understanding the essence of GES and in clarifying the difference between GES and GAS for continuous-time recurrent neural networks.
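
The spectral-abscissa condition stated above can be checked numerically. The sketch below is a minimal illustration under stated assumptions, not the paper's method: it assumes a standard additive RNN model dx/dt = -Dx + W tanh(x) + u (the abstract's "generic class" is not specified here), with hypothetical matrices D and W, input u, and a simple fixed-point iteration to locate the equilibrium x*; it then evaluates the maximum real part of the eigenvalues of the Jacobian at x*.

```python
import numpy as np

def spectral_abscissa(J):
    """Maximum real part of the eigenvalues of J (the 'maximum abscissa
    of the spectral set' referred to in the abstract)."""
    return np.max(np.real(np.linalg.eigvals(J)))

# Hypothetical network: dx/dt = -D x + W tanh(x) + u.
# D: positive diagonal decay matrix; W: recurrent weight matrix with
# small entries (the 'small state feedback coefficients' regime);
# all values below are illustrative, not from the paper.
D = np.diag([1.0, 1.0])
W = np.array([[0.2, -0.3],
              [0.1,  0.25]])
u = np.array([0.1, -0.2])

# Locate the equilibrium x* by fixed-point iteration on
# x = D^{-1} (W tanh(x) + u); converges here since ||W|| is small.
x = np.zeros(2)
for _ in range(200):
    x = np.linalg.solve(D, W @ np.tanh(x) + u)

# Jacobian at x*: J = -D + W diag(tanh'(x*)), tanh'(s) = 1 - tanh(s)^2.
J = -D + W @ np.diag(1.0 - np.tanh(x) ** 2)

alpha = spectral_abscissa(J)
print(f"spectral abscissa at x*: {alpha:.4f}")
# If alpha < 0 and the network is GAS, the abstract's first result says
# the network is GES; its second result (small feedback regime) says the
# supremum of exponential convergence rates equals |alpha|.
```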
