Abstract
This paper presents new results on the absolute exponential stability (AEST) of neural networks with a general class of partially Lipschitz continuous and monotone increasing activation functions, under the mild condition that the interconnection matrix $T$ of the network system is additively diagonally stable; i.e., for any positive diagonal matrix $D_1$ there exists a positive diagonal matrix $D_2$ such that $D_2(T - D_1) + (T - D_1)^T D_2$ is negative definite. This result means that neural networks with additively diagonally stable interconnection matrices are guaranteed to be globally exponentially stable for any neuron activation functions in the above class, any constant input vectors, and any other network parameters. The additively diagonally stable interconnection matrices include diagonally semistable matrices and H-matrices with nonpositive diagonal elements as special cases. The obtained AEST result substantially extends the existing results in the literature on absolute stability (ABST) of neural networks. The additive diagonal stability condition is shown to be necessary and sufficient for AEST of neural networks with two neurons. A summary and discussion of the known results on ABST and AEST of neural networks are also given.
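The additive diagonal stability condition quantifies over all positive diagonal matrices $D_1$, but for a fixed $D_1$ the inner condition is a linear matrix inequality in the diagonal entries of $D_2$ and can be tested numerically. The sketch below (not part of the paper; it assumes numpy and cvxpy with an SDP-capable solver such as SCS, and the function names, the `eps` margin, and the sample matrix `T` are hypothetical choices for illustration) spot-checks the condition by sampling a few random $D_1$ and solving the resulting LMI feasibility problem for each.

```python
# Numerical sketch (assumption, not the paper's method): spot-check additive
# diagonal stability of an interconnection matrix T by sampling positive
# diagonal D1 and solving the LMI "find diagonal D2 > 0 such that
# D2 (T - D1) + (T - D1)^T D2 is negative definite" for each sample.
import numpy as np
import cvxpy as cp


def diagonally_stable(A: np.ndarray, eps: float = 1e-6) -> bool:
    """True if some positive diagonal D2 makes D2 A + A^T D2 negative definite."""
    n = A.shape[0]
    d = cp.Variable(n)            # diagonal entries of D2
    D2 = cp.diag(d)
    constraints = [
        d >= 1.0,                 # positivity; scaling D2 is free, so d >= 1 loses nothing
        D2 @ A + A.T @ D2 << -eps * np.eye(n),  # negative definite up to an eps margin
    ]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)


def spot_check_additive_diagonal_stability(T: np.ndarray, trials: int = 20, seed: int = 0) -> bool:
    """Sample random positive diagonal D1 and test each T - D1.

    This is only a numerical spot check; it cannot prove the universally
    quantified "for any positive diagonal D1" condition."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    for _ in range(trials):
        D1 = np.diag(rng.uniform(0.1, 10.0, size=n))
        if not diagonally_stable(T - D1):
            return False
    return True


if __name__ == "__main__":
    # Hypothetical 2-neuron interconnection matrix, chosen to be diagonally
    # stable (T + T^T is negative definite), hence also additively diagonally stable.
    T = np.array([[-1.0, 2.0],
                  [-2.0, -1.0]])
    print(spot_check_additive_diagonal_stability(T))  # expected: True
```

For this example $D_2 = I$ already works for every $D_1$, since $(T - D_1) + (T - D_1)^T = T + T^T - 2D_1$ only becomes more negative as $D_1$ grows; this is consistent with the abstract's remark that diagonally semistable matrices are a special case of additively diagonally stable ones.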