Abstract

In this paper, we study global exponential stability in the Lagrange sense for continuous recurrent neural networks (RNNs) with multiple time delays. Three types of activation functions are considered, including both bounded and unbounded functions. By constructing appropriate Lyapunov-like functions, we derive easily verifiable criteria for the boundedness and global exponential attractivity of such RNNs. These criteria can be used to analyze both monostable and multistable neural networks.
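For orientation only, since the excerpt does not reproduce the network model itself: delayed RNNs of the class considered here are commonly written in the form

$$
\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij}\, g_j\bigl(x_j(t-\tau_{ij})\bigr) + u_i, \qquad i = 1,\dots,n,
$$

where the symbols are illustrative rather than taken from the paper: $d_i > 0$ are self-feedback rates, $a_{ij}$ and $b_{ij}$ are instantaneous and delayed connection weights, $g_j$ are the activation functions, $\tau_{ij} \ge 0$ are transmission delays, and $u_i$ are external inputs. Global exponential stability in the Lagrange sense then means that some compact set $\Omega$ is globally exponentially attractive, i.e. the distance from any trajectory $x(t)$ to $\Omega$ is bounded by $M e^{-\varepsilon (t - t_0)}$ for constants $M, \varepsilon > 0$, rather than requiring convergence to a single equilibrium; this is what allows the same framework to cover multistable networks.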
