Abstract

This paper addresses the global exponential stability of delayed recurrent neural networks (DRNNs). By constructing an augmented Lyapunov-Krasovskii functional and adopting the reciprocally convex combination approach together with the Wirtinger-based integral inequality, delay-dependent global exponential stability criteria are derived in terms of linear matrix inequalities (LMIs). In addition, a general and effective method for the global exponential stability analysis of DRNNs is provided through a lemma, by which the exponential convergence rate can be estimated. With this lemma, some global asymptotic stability criteria for DRNNs obtained in previous studies can be generalized to global exponential stability criteria. Finally, a frequently used numerical example illustrates the effectiveness and merits of the proposed theoretical results.
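
For context, the sketch below states the Wirtinger-based integral inequality named above in its standard form from the literature, together with a typical DRNN model of the kind analyzed in such work; the paper's exact notation and assumptions may differ. A commonly studied model is
\[
\dot{x}(t) = -C x(t) + A f(x(t)) + B f(x(t-\tau(t))) + J,
\]
where $x(t)\in\mathbb{R}^n$ is the neuron state, $C$ is a positive diagonal matrix, $A$ and $B$ are the connection and delayed-connection weight matrices, $f(\cdot)$ is the activation function, $J$ is a constant input, and $\tau(t)$ is the time-varying delay. For any matrix $R>0$ and any function $x$ differentiable on $[a,b]$, the Wirtinger-based integral inequality reads
\[
\int_a^b \dot{x}^{\top}(s)\, R\, \dot{x}(s)\, \mathrm{d}s \;\ge\; \frac{1}{b-a}\,\big(x(b)-x(a)\big)^{\top} R \,\big(x(b)-x(a)\big) \;+\; \frac{3}{b-a}\,\Omega^{\top} R\, \Omega,
\]
with $\Omega = x(b) + x(a) - \frac{2}{b-a}\int_a^b x(s)\,\mathrm{d}s$. This bound tightens the classical Jensen inequality and is what allows less conservative delay-dependent LMI conditions of the type derived in the paper.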
