Abstract

This paper is concerned with the analysis of global exponential stability for a class of recurrent neural networks (RNNs) with mixed discrete and distributed delays. We first prove the existence and uniqueness of the equilibrium point under mild conditions, assuming neither differentiability nor strict monotonicity of the activation functions. Then, by employing a new Lyapunov–Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the RNNs to be globally exponentially stable. As a result, the global exponential stability of the delayed RNNs can be checked with the numerically efficient Matlab LMI toolbox, and no tuning of parameters is required. A simulation example illustrates the usefulness of the derived LMI-based stability conditions.
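The paper's actual conditions are LMIs derived from a Lyapunov–Krasovskii functional and are not reproduced here. As a minimal sketch of the underlying Lyapunov idea only, under the simplifying assumption of a delay-free linear system x' = Ax with a hypothetical Hurwitz matrix A, exponential stability can be certified by solving the Lyapunov equation AᵀP + PA = −Q and checking that P is positive definite:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical state matrix for x' = A x; A is Hurwitz
# (all eigenvalues have negative real part), chosen for illustration.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])

# Solve the Lyapunov equation A^T P + P A = -Q with Q = I.
# solve_continuous_lyapunov(a, q) solves a X + X a^T = q,
# so passing (A.T, -Q) yields A^T P + P A = -Q.
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)

# Exponential stability holds iff P is symmetric positive definite.
eigs = np.linalg.eigvalsh((P + P.T) / 2)
print("P positive definite:", bool(np.all(eigs > 0)))  # prints "P positive definite: True"
```

For the delayed RNNs treated in the paper, this scalar check is replaced by feasibility of a set of LMIs, which solvers such as the Matlab LMI toolbox test numerically without parameter tuning.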
