Abstract

Motivated by the interference that piecewise constant arguments (PCAs) and neutral terms (NTs) introduce into the original system, and by their significant applications in signal transmission, we explore the robustness of the global exponential stability (GES) of recurrent neural networks (RNNs) with PCAs and NTs (NPRNNs). The following challenge arises: what range of PCAs and what scope of NTs can an NPRNN tolerate while remaining exponentially stable? We therefore derive two key indicators: the maximum interval length of the PCAs and the bound on the NT compression coefficient under which an NPRNN remains exponentially stable. Additionally, we prove theoretically that if the interval length of the PCAs and the bound of the NT compression coefficient are both lower than the values given herein, the disturbed NPRNN remains globally exponentially stable. Finally, two numerical examples verify the effectiveness of the deduced results.
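To make the setting concrete, the following minimal Python sketch simulates a scalar NPRNN of the assumed form x'(t) = -a·x(t) + b·f(x(t)) + c·f(x(γ(t))) + d·x'(γ(t)), where γ(t) = kh on each interval [kh, (k+1)h), h plays the role of the PCA interval length, and d is the NT compression coefficient. All equations, parameter names, and values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_nprnn(a=2.0, b=0.5, c=0.3, d=0.2, h=0.5,
                   x0=1.0, T=20.0, dt=1e-3):
    """Euler simulation of a scalar NPRNN (illustrative sketch only)."""
    f = np.tanh                      # sigmoid-type activation
    steps = int(T / dt)
    x = x0
    x_g = x0                         # x(gamma(t)), frozen at each k*h
    dx_g = 0.0                       # x'(gamma(t)), frozen at each k*h
    traj = np.empty(steps)
    for i in range(steps):
        if (i * dt) % h < dt:        # entering a new interval [k*h, (k+1)*h)
            x_g = x
            # at t = k*h, gamma(t) = t, so x'(t) solves the fixed point
            # x' = -a*x + (b+c)*f(x) + d*x'  =>  x' = (-a*x + (b+c)*f(x)) / (1-d)
            dx_g = (-a * x + (b + c) * f(x)) / (1.0 - d)
        dx = -a * x + b * f(x) + c * f(x_g) + d * dx_g
        x += dt * dx                 # explicit Euler step
        traj[i] = x
    return traj

# Heuristic check mirroring the paper's two indicators: smaller h (PCA
# interval length) and smaller |d| (NT compression coefficient) favor
# exponential decay of the state toward the origin.
for h, d in [(0.2, 0.1), (2.0, 0.1), (0.2, 0.9)]:
    tail = np.abs(simulate_nprnn(h=h, d=d)[-1000:]).max()
    print(f"h={h}, d={d}: max |x| on last segment = {tail:.4f}")
```

Sweeping h and d in this way gives a rough numerical analogue of the question the paper answers analytically: how large the PCA interval and the NT coefficient can be before exponential stability is lost.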

Highlights

  • In terms of electrical implementations, such as mutual reciprocity, package modelling, and electromagnetic interference design for multiple digital computers, promising results have been obtained in certain studies [19]. Therefore, several well-established methods exist for studying neutral neural networks: the one-step method [20, 21], the block boundary value method [22, 23], the Euler–Maclaurin method [24], the Runge–Kutta method [25], and the Legendre multidomain spectral collocation method [26] (see the explicit Euler sketch after this list). However, very few specific approaches study the robustness of the system by giving the supremum of the neutral term compression coefficient under additional simultaneous disturbances

  • The method of piecewise constant arguments unifies delayed and advanced arguments and can be applied to other important systems [23, 34, 35], which greatly simplifies handling the impact of time lags on most systems. Therefore, we focus here on recurrent neural networks with piecewise constant arguments and neutral terms (NPRNNs) to improve the convenience and completeness of neural networks with NTs (NRNNs)

  • To the best of our knowledge, existing work investigates the robustness of hybrid stochastic models, nonlinear models, and recurrent models equipped with neutral terms (NTs) but without piecewise constant arguments (PCAs) [27, 28]
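As referenced in the first highlight, the sketch below illustrates how a one-step scheme (explicit Euler) of the kind cited in [20, 21] can be applied to a scalar neutral delay differential equation, storing past derivative values so the neutral term x'(t − τ) can be looked up. The equation, parameter names, and values are illustrative assumptions, not the cited schemes' actual formulations.

```python
import numpy as np

def euler_neutral_dde(a=2.0, b=0.5, d=0.3, tau=1.0, T=20.0, dt=1e-3):
    """Explicit Euler ('one-step') integration of a scalar neutral DDE."""
    n_lag = int(round(tau / dt))     # delay measured in grid steps
    steps = int(T / dt)
    x = np.empty(steps + 1)
    dx = np.zeros(steps + 1)         # derivative history for the neutral term
    x[0] = 1.0                       # constant initial history x(t) = 1, t <= 0
    for i in range(steps):
        x_lag = x[i - n_lag] if i >= n_lag else x[0]
        dx_lag = dx[i - n_lag] if i >= n_lag else 0.0
        # assumed model: x'(t) = -a*x(t) + b*tanh(x(t - tau)) + d*x'(t - tau)
        dx[i] = -a * x[i] + b * np.tanh(x_lag) + d * dx_lag
        x[i + 1] = x[i] + dt * dx[i]  # one explicit Euler step
    return x

print("final state:", euler_neutral_dde()[-1])  # decays toward 0 when |d| < 1
```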

Summary

Introduction

Investigation and synthesis of recurrent neural networks (RNNs) is an enduring subject, past and present, owing to their wide application in image and object recognition, speech recognition, model prediction, automatic control, signal processing, and so forth [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18]. Therefore, we focus here on recurrent neural networks with piecewise constant arguments and neutral terms (NPRNNs) to improve the convenience and completeness of NRNNs. To the best of our knowledge, existing work investigates the robustness of hybrid stochastic models, nonlinear models, and recurrent models equipped with NTs but without PCAs [27, 28]. In this paper, we provide both the interval length of the PCAs and the bound of the NT compression coefficient under which the perturbed NPRNN remains exponentially stable. We show theoretically that if the interval length of the PCAs and the bound of the NT compression coefficient of the disturbed system are both lower than the upper bounds given in this paper, the NPRNN remains stable.

Preliminaries and Notations
Main Results
Numerical Examples