Abstract

This paper analyzes the robustness of global exponential stability of delayed recurrent neural networks (DRNNs) subject to parameter uncertainty in the connection weight matrices. Given a globally exponentially stable DRNN, the problem addressed herein is how much parameter uncertainty the connection weight matrices can tolerate while the neural network remains globally exponentially stable. We characterize upper bounds on the parameter uncertainty intensity under which the DRNN sustains global exponential stability; these upper bounds are obtained as solutions of transcendental equations. Moreover, we prove that, for a globally exponentially stable DRNN, if the additive parameter uncertainties in the connection weight matrices are smaller than the upper bounds derived here, then the perturbed DRNN is guaranteed to remain globally exponentially stable. A numerical example is provided to illustrate the theoretical results.
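To make the setting concrete, the following is a minimal sketch of the model class and robustness statement described above, assuming the standard additive-perturbation form of a DRNN; the symbols A, B, C, f, tau, delta, M, and epsilon are illustrative notation, not the paper's own.

% Nominal DRNN, assumed globally exponentially stable:
%   \dot{u}(t) = -A u(t) + B f(u(t)) + C f(u(t - \tau))
% Perturbed connection weight matrices: B + \Delta B and C + \Delta C.
\begin{align*}
  \dot{u}(t) &= -A\,u(t) + (B + \Delta B)\,f\bigl(u(t)\bigr)
               + (C + \Delta C)\,f\bigl(u(t-\tau)\bigr),\\
  \|\Delta B\| &\le \delta_B, \qquad \|\Delta C\| \le \delta_C.
\end{align*}
% The result asserts the existence of bounds \delta_B, \delta_C > 0,
% characterized by transcendental equations, such that any perturbation
% within these bounds preserves global exponential stability:
%   \|u(t)\| \le M e^{-\varepsilon t} \sup_{s \in [-\tau, 0]} \|u(s)\|
% for some M \ge 1 and some decay rate \varepsilon > 0.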
