Abstract

Global exponential stability of a class of neural networks with periodic coefficients and piecewise constant arguments is investigated in this paper. A new definition of exponential stability and a novel differential inequality with piecewise constant arguments are introduced to obtain sufficient conditions for the global exponential stability of the periodic solution of the neural networks. The stability criteria are independent of the upper bound on the difference between adjacent discontinuous switching moments. By virtue of the new definition of exponential stability and the novel differential inequality, not only is it unnecessary to establish any relationship between the norms of the states with and without piecewise constant arguments, but the stability criteria for the neural networks can also be obtained directly from the original differential equation, rather than from the equivalent integral equation widely used in earlier works. Typical numerical examples illustrate the validity and reduced conservatism of the theoretical results.
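
For context, a hedged sketch of what a system in this class typically looks like: the model below follows standard formulations of neural networks with piecewise constant arguments, and its notation is an assumption for illustration only, not taken from this paper's full text.

% Assumed representative model: a Hopfield-type network with
% omega-periodic coefficients and a piecewise constant argument gamma(t).
\begin{equation*}
  x_i'(t) = -a_i(t)\, x_i(t)
            + \sum_{j=1}^{n} b_{ij}(t)\, f_j\bigl(x_j(t)\bigr)
            + \sum_{j=1}^{n} c_{ij}(t)\, g_j\bigl(x_j(\gamma(t))\bigr)
            + I_i(t), \qquad i = 1, \dots, n,
\end{equation*}
% where gamma(t) = theta_k for t in [theta_k, theta_{k+1}), the theta_k are the
% discontinuous switching moments, and a_i, b_ij, c_ij, I_i are omega-periodic.

In this (assumed) notation, criteria that are independent of the upper bound on the difference between adjacent switching moments place no restriction on how large sup_k (theta_{k+1} - theta_k) may be.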
