Abstract

Further results on the robustness of the global exponential stability of recurrent neural networks with neutral terms and piecewise constant arguments (NPRNN) subject to uncertain connection weights are presented in this paper. The core tasks and challenges are estimating the upper bounds of the two categories of interference factors and establishing a measuring mechanism for the uncertain dual connection weights. Hence, on the one hand, new sufficient criteria on the upper bounds of the neutral terms and the piecewise constant arguments that guarantee the global exponential stability of the NPRNN are provided. On the other hand, the admissible region of the dual connection weights is characterized by a four-variable transcendental equation based on the preceding stable NPRNN. In this way, the two interference factors and the dual uncertain connection weights are mutually restricted in the parameter-uncertain NPRNN model, which leads to a dynamic evolution relationship. Finally, numerical simulation comparisons of stable and unstable cases are provided to verify the effectiveness of the deduced results.
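
For orientation, the following is a minimal sketch of the model class the abstract refers to, under an assumed generic formulation; the symbols \(A\), \(B\), \(C\), \(D\), the activation \(f\), and the piecewise constant argument \(\gamma(t)\) are illustrative assumptions and are not taken from the paper's own notation.

\[
\dot{x}(t) = -A\,x(t) + B\,f(x(t)) + C\,f(x(\gamma(t))) + D\,\dot{x}(\gamma(t)),
\qquad \gamma(t) = \theta_k \ \text{ for } t \in [\theta_k, \theta_{k+1}),
\]

where \(x(t) \in \mathbb{R}^n\) is the state, \(A\) is a positive diagonal self-feedback matrix, \(B\) and \(C\) are the dual connection weights acting on the instantaneous state and on the state evaluated at the piecewise constant argument, and \(D\) carries the neutral term. In this setting the two questions of the abstract read roughly as: (i) how large may \(\|D\|\) and the argument step \(\sup_k(\theta_{k+1}-\theta_k)\) be before global exponential stability is lost, and (ii) given a stable nominal pair \((B, C)\), how large may the deviations \(\|\tilde{B}-B\| \le \sigma_B\) and \(\|\tilde{C}-C\| \le \sigma_C\) of the uncertain weights be, with the admissible \((\sigma_B, \sigma_C)\) tied to the neutral-term and argument bounds through a transcendental equation in these four quantities.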

Highlights

  • Since recurrent neural networks (RNNs) have the capabilities of parallel processing, distributed information storage, and associative learning, a series of neural networks that are typical representatives of RNNs, such as Hopfield neural networks, Cohen–Grossberg neural networks, cellular neural networks, BAM neural networks, high-order cellular neural networks, and shunting inhibitory neural networks, have attracted extensive attention over the years

  • Perturbations are subsequently imposed on RNNs to further examine and guarantee the robustness of global exponential stability (RoGES) of RNNs

  • There are hardly any studies addressing the RoGES of recurrent neural networks with neutral terms and piecewise constant arguments (NPRNN) under uncertain connection weights


Summary

Introduction

Since recurrent neural networks (RNNs) have the capabilities of parallel processing, distributed information storage, and associative learning, a series of neural networks that are typical representatives of RNNs, such as Hopfield neural networks, Cohen–Grossberg neural networks, cellular neural networks, BAM neural networks, high-order cellular neural networks, and shunting inhibitory neural networks, have attracted extensive attention over the years. However, there are hardly any studies addressing the RoGES of recurrent neural networks with neutral terms and piecewise constant arguments (NPRNN) under uncertain connection weights.

Problem Formulation
Main Results
Illustrative Example