Abstract

In this paper, we study global exponential stability in the Lagrange sense for recurrent neural networks with time-varying delays and general activation functions. Under the assumption that the activation functions need not be bounded, monotonic, or differentiable, several algebraic criteria in linear matrix inequality form for global exponential stability in the Lagrange sense are obtained by means of Lyapunov functions and the Halanay delay differential inequality. Estimates of the globally exponentially attractive sets are also given. The results derived here are more general than those in the existing literature. Finally, two examples are presented and analyzed to demonstrate our results.
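For context, the Halanay delay differential inequality invoked above can be stated in its classical form as follows (a standard statement, not the paper's specific variant):

```latex
% Classical Halanay inequality: if a nonnegative function v(t) satisfies
%   \dot{v}(t) \le -a\, v(t) + b \sup_{t-\tau \le s \le t} v(s), \quad t \ge t_0,
% with constants a > b \ge 0 and delay bound \tau > 0, then
%   v(t) \le \Big( \sup_{t_0-\tau \le s \le t_0} v(s) \Big) e^{-\lambda (t - t_0)},
% where \lambda > 0 is the unique positive root of
%   \lambda = a - b\, e^{\lambda \tau}.
\[
\dot{v}(t) \le -a\, v(t) + b \sup_{t-\tau \le s \le t} v(s)
\;\Longrightarrow\;
v(t) \le \Big( \sup_{t_0-\tau \le s \le t_0} v(s) \Big) e^{-\lambda (t - t_0)} .
\]
```

Applied to a suitable Lyapunov function along the trajectories of the delayed network, this inequality yields the exponential convergence rate toward the attractive set.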
