Abstract

In this paper, we investigate the global exponential dissipativity of neural networks with variable delays and impulses. The impulses are classified into three types: impulses acting as input disturbances, stabilizing impulses, and impulses of "neutral" type, which neither help to stabilize nor destabilize the networks. All three types are handled in a unified way by means of an approach recently introduced in the literature. To realize this approach, we propose new techniques combined with more general Lyapunov functions, and we show that they are more effective. Exponential dissipativity conditions are established in terms of linear matrix inequalities (LMIs), and these conditions reduce straightforwardly to exponential stability conditions. Numerical examples show that the obtained conditions are effective and less conservative than existing ones.
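For context, a typical model in this setting is an impulsive delayed neural network of the form sketched below. This is a standard formulation given only for illustration; the matrices $D$, $A$, $B$, the activation $f$, the delay $\tau(t)$, the input $u$, and the impulse maps $I_k$ are assumed notation, and the paper's exact system and definitions may differ:

\[
\begin{aligned}
\dot{x}(t) &= -D\,x(t) + A f\bigl(x(t)\bigr) + B f\bigl(x(t-\tau(t))\bigr) + u, && t \neq t_k,\\
x(t_k^{+}) &= x(t_k^{-}) + I_k\bigl(x(t_k^{-})\bigr), && k \in \mathbb{N},
\end{aligned}
\]

where $D$ is a positive diagonal matrix, $A$ and $B$ are connection weight matrices, $f$ is the activation function, $0 \le \tau(t) \le \tau$ is the time-varying delay, and $I_k$ describes the state jump at the impulse instant $t_k$. In the standard terminology, such a network is globally exponentially dissipative if there exist a compact set $S \subset \mathbb{R}^n$ and a rate $\varepsilon > 0$ such that every trajectory converges to $S$ at least as fast as $M e^{-\varepsilon t}$ for some $M > 0$ depending on the initial condition; when $S$ reduces to an equilibrium point, this recovers global exponential stability, which is why dissipativity conditions of this kind specialize to stability conditions.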
