Abstract

In this work, the exponential stability problem of impulsive recurrent neural networks is investigated, with discrete time delays, continuously distributed delays and stochastic noise taken into consideration simultaneously. Two distinct types of sufficient conditions guaranteeing exponential stability are derived, based on the Lyapunov functional and the coefficients of the given system. In addition, a novel graph-theoretic approach, which combines the Lyapunov functional method with graph theory, is used to construct a Lyapunov function for the large-scale system; in this approach a global Lyapunov functional is built that reflects the topological structure of the given system. A numerical example and simulation figures are presented to show the effectiveness of the proposed work.
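As a rough illustration of the graph-theoretic construction mentioned above, the following is a minimal sketch of what such a global Lyapunov functional typically looks like, assuming the network topology is encoded as a strongly connected weighted digraph and that the weights are the standard Kirchhoff matrix-tree cofactors (a Li–Shuai-type construction); the exact functional used in the paper may differ.

```latex
% Sketch (assumed standard graph-theoretic construction; the paper's exact
% functional may differ).  The network is a weighted digraph (G, A) with
% weights a_{ij} >= 0 and a vertex Lyapunov function V_i for each node i.
\[
  V(x_1,\dots,x_n) \;=\; \sum_{i=1}^{n} c_i\, V_i(x_i),
  \qquad
  c_i \;=\; \operatorname{cof}_{ii}\!\big(L(A)\big),
\]
% where L(A) is the weighted Laplacian of (G, A).  By Kirchhoff's matrix-tree
% theorem, c_i > 0 whenever (G, A) is strongly connected, and the cofactor
% weights make the coupling cross terms in the derivative of V cancel along
% the directed cycles of G, which is what ties the global condition to the
% topological structure of the system.
```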

Highlights

  • The differential dynamics model is one of the basic tools in the characterization of natural and engineering processes [7, 14, 19, 22, 40, 41], and it is a basic building block of complicated neural networks [18, 65]

  • The artificial neural network (ANN) model is a simplified mathematical description of natural neural networks

  • The remainder of this work is organized as follows: in the next section, we present the mathematical description of the impulsive stochastic recurrent neural networks with mixed delays (ISRNNMDs), together with the corresponding preliminaries, assumptions and basic notation

Introduction

The differential dynamics model is one of the basic tools in the characterization of natural and engineering processes [7, 14, 19, 22, 40, 41], and it is a basic building block of complicated neural networks [18, 65]. A typical model in this class is an Itô-type stochastic delayed recurrent neural network driven by Brownian motions ωj(t), in which di > 0 and αij, βij are all positive constants, fj, gj denote the activation functions, τi(t) are the discrete time-varying transmission delays, and Ii(t) is the external input. The exponential stability of neural networks with time-varying delays, stochastic effects and impulsive effects has been investigated by many researchers: in [29, 45], Raja et al. investigated exponential stability by using Lyapunov functionals and linear matrix inequalities; in [59], Congcong et al. studied exponential stability by using Lyapunov and impulsive delay differential inequality techniques; and Li et al. [31] investigated exponential stability by using Razumikhin techniques, the Lyapunov functional approach and stochastic analysis. The outcome of our study strictly generalizes the one in [39].
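To make the class of systems discussed above concrete, the following is a minimal Euler–Maruyama simulation sketch of a two-neuron stochastic recurrent network with a discrete delay and fixed-time multiplicative impulses. All parameter values (di, αij, βij, the delay, the noise form, and the impulse times and gain) are hypothetical and chosen only for illustration; they are not taken from the paper.

```python
import numpy as np

# Hypothetical two-neuron impulsive stochastic delayed RNN, simulated with
# Euler-Maruyama.  Assumed model form (typical of this literature):
#   dx_i = [-d_i x_i + sum_j a_ij f_j(x_j(t)) + sum_j b_ij f_j(x_j(t - tau)) + I_i] dt
#          + sigma_i x_i dw_i,   with impulses x(t_k+) = mu * x(t_k).
rng = np.random.default_rng(0)

d = np.array([1.5, 1.2])                 # self-decay rates d_i > 0
A = np.array([[0.2, -0.3], [0.1, 0.25]]) # instantaneous connection weights a_ij
B = np.array([[0.15, 0.1], [-0.2, 0.1]]) # delayed connection weights b_ij
I = np.array([0.0, 0.0])                 # external inputs I_i
sigma = np.array([0.1, 0.1])             # noise intensities (assumed linear noise)
tau = 0.5                                # constant discrete delay (hypothetical)
f = np.tanh                              # activation functions (assumed tanh)

dt = 1e-3
T = 10.0
steps = int(T / dt)
lag = int(tau / dt)

impulse_steps = {int(tk / dt) for tk in (2.0, 4.0, 6.0, 8.0)}
impulse_gain = 0.8                       # x(t_k+) = 0.8 * x(t_k)

# history buffer: constant initial function x(s) = x0 on [-tau, 0]
x0 = np.array([0.8, -0.6])
hist = np.tile(x0, (lag + 1, 1))
traj = np.empty((steps + 1, 2))
traj[0] = x0

x = x0.copy()
for n in range(1, steps + 1):
    x_del = hist[0]                      # x(t - tau)
    drift = -d * x + A @ f(x) + B @ f(x_del) + I
    dW = rng.normal(0.0, np.sqrt(dt), size=2)
    x = x + drift * dt + sigma * x * dW  # Euler-Maruyama step
    if n in impulse_steps:               # apply impulsive jump
        x = impulse_gain * x
    hist = np.vstack([hist[1:], x])      # slide the delay buffer forward
    traj[n] = x

print("state at T:", traj[-1])           # expected to decay toward 0 if stable
```

The delay is handled with a fixed-length history buffer and the impulses are modelled as multiplicative jumps at fixed times; under sufficient conditions of the kind discussed above, the simulated state would be expected to decay exponentially toward the equilibrium.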
