Abstract

This paper is concerned with the problem of stability analysis for a class of discrete-time recurrent neural networks with time-varying delays. Under a weak assumption on the activation functions and using a new Lyapunov functional, a delay-dependent condition guaranteeing the global exponential stability of the neural network under consideration is obtained in terms of a linear matrix inequality. It is shown that this stability condition is less conservative than some previous ones in the literature. When norm-bounded parameter uncertainties appear in a delayed discrete-time recurrent neural network, a delay-dependent robust exponential stability criterion is also presented. Numerical examples are provided to demonstrate the effectiveness of the proposed method.
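
For context, a generic illustration of the setting (the paper's exact model, the assumptions on the activation functions, and the LMI condition itself appear in the full text): a discrete-time recurrent neural network with a time-varying delay is commonly written as

$$x(k+1) = C\,x(k) + A\,f\bigl(x(k)\bigr) + B\,f\bigl(x(k-\tau(k))\bigr), \qquad \tau_m \le \tau(k) \le \tau_M,$$

where $C=\operatorname{diag}(c_1,\dots,c_n)$ with $|c_i|<1$ describes the state decay, $A$ and $B$ are the connection and delayed-connection weight matrices, $f(\cdot)$ collects the neuron activation functions, and $\tau(k)$ is the bounded time-varying delay. Global exponential stability then means there exist constants $\alpha \ge 1$ and $0<\beta<1$ such that $\|x(k)\| \le \alpha\,\beta^{k}\sup_{-\tau_M \le s \le 0}\|x(s)\|$ for all admissible initial conditions; the delay-dependent LMI proposed in the paper is a condition whose feasibility certifies this property.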
