Abstract

This paper presents new results on the global asymptotic stability (GAS) and global exponential stability (GES) of a general class of continuous-time recurrent neural networks with Lipschitz continuous and monotone nondecreasing activation functions. We first give three sufficient conditions for the GAS of such neural networks. These testable sufficient conditions differ from and improve upon existing ones. We then extend an existing GAS result to a GES result, and also extend existing GES results to more general cases with less restrictive connection weight matrices and/or partially Lipschitz activation functions.
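The abstract does not state the network model explicitly; a commonly studied formulation for this class of recurrent neural networks (given here only as an illustrative assumption, not taken from the paper) is

\[
\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} w_{ij}\, g_j\bigl(x_j(t)\bigr) + u_i, \qquad i = 1, \dots, n,
\]

where $d_i > 0$ are self-inhibition rates, $W = (w_{ij})$ is the connection weight matrix, each activation $g_j$ is Lipschitz continuous and monotone nondecreasing, and $u_i$ is a constant external input. Under this kind of formulation, GAS and GES refer to the global convergence (asymptotic or exponential, respectively) of all trajectories to the network's equilibrium point.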
