Abstract

This paper studies the global output convergence of a class of recurrent neural networks with globally Lipschitz continuous, monotone nondecreasing activation functions and locally Lipschitz continuous time-varying inputs. We establish two sufficient conditions for the global output convergence of this class of neural networks. Symmetry of the connection weight matrix is not required, so the present results extend existing ones.
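
For orientation, networks of this type are commonly written in the following state-space form; the concrete equations below are an assumption based on the standard Hopfield-type formulation used in this literature, since the abstract itself does not state the model:

\dot{x}(t) = -D x(t) + W g(x(t)) + u(t), \qquad y(t) = g(x(t)),

where x(t) \in \mathbb{R}^n is the state vector, D is a positive diagonal matrix, W is the connection weight matrix (not assumed symmetric), g(x) = (g_1(x_1), \ldots, g_n(x_n))^T collects the globally Lipschitz continuous, monotone nondecreasing activation functions applied componentwise, and u(t) is the locally Lipschitz continuous time-varying input. Under this formulation, global output convergence means that the output y(t) tends to a constant vector as t \to \infty for every initial state x(0).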
