Abstract

The exponential stability problem is investigated for a class of stochastic neural networks (SNNs) with Markovian jump parameters, mixed time delays, and α-inverse Hölder activation functions. The jumping parameters are modeled as a continuous-time finite-state Markov chain. First, based on properties of the Brouwer degree, the existence and uniqueness of the equilibrium point of the SNNs without noise perturbations are proved. Second, by applying the Lyapunov-Krasovskii functional approach, stochastic analysis theory, and the linear matrix inequality (LMI) technique, new delay-dependent sufficient criteria are obtained in terms of LMIs that ensure the SNNs with noise perturbations are globally exponentially stable in the mean square. Finally, two simulation examples are provided to demonstrate the validity of the theoretical results.
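For orientation, Markovian-jump SNNs with mixed (discrete plus distributed) time delays are commonly written in the following generic form; this is a standard sketch of the model class, and the specific matrices, delay structure, and activation assumptions in the paper may differ:

```latex
\mathrm{d}x(t) = \Big[ -C(r(t))\,x(t) + A(r(t))\,f(x(t)) + B(r(t))\,f(x(t-\tau(t)))
      + D(r(t)) \int_{t-d(t)}^{t} f(x(s))\,\mathrm{d}s + J \Big]\mathrm{d}t
      + \sigma\big(t,\, x(t),\, x(t-\tau(t)),\, r(t)\big)\,\mathrm{d}\omega(t)
```

Here $x(t)$ is the neuron state vector, $r(t)$ is the continuous-time finite-state Markov chain governing the jumps, $\tau(t)$ and $d(t)$ are the discrete and distributed delays, $f(\cdot)$ is the activation function (α-inverse Hölder in this paper), and $\omega(t)$ is a Brownian motion modeling the noise perturbation.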

Highlights

  • In the past few decades, there has been increasing interest in different classes of neural networks such as Hopfield, cellular, Cohen-Grossberg, and bidirectional associative neural networks due to their potential applications in many areas such as classification, signal and image processing, parallel computing, associative memories, optimization, and cryptography [1,2,3,4,5,6].

  • It should be noted that all the results reported in the literature above concern Markovian jumping stochastic neural networks (SNNs) with Lipschitz neuron activation functions.

  • We have dealt with the global exponential stability problem for a class of stochastic neural networks with α-inverse Hölder activation functions, Markovian jump parameters, and mixed time delays.



Introduction

In the past few decades, there has been increasing interest in different classes of neural networks, such as Hopfield, cellular, Cohen-Grossberg, and bidirectional associative neural networks, due to their potential applications in many areas such as classification, signal and image processing, parallel computing, associative memories, optimization, and cryptography [1,2,3,4,5,6]. To solve problems in optimization, neural control, signal processing, and so forth, neural networks must be designed so that, for a given external input, they exhibit exactly one globally asymptotically/exponentially stable equilibrium point. As pointed out in [20], in real nervous systems and in implementations of artificial neural networks, synaptic transmission is a noisy process caused by random fluctuations in neurotransmitter release and other probabilistic factors; noise is therefore unavoidable and should be taken into account in modeling. The stochastic stability of various neural networks, with or without delays, under noise disturbance has received extensive attention in recent years, and a number of results on this issue have been reported in the literature; see [23,24,25,26,27,28,29,30,31].
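The kind of noisy, mode-switching dynamics described above can be simulated with an Euler-Maruyama scheme. The sketch below is purely illustrative: the two-neuron parameters, the two-state Markov chain, and the tanh activation (a Lipschitz stand-in, not the paper's α-inverse Hölder functions) are all assumptions chosen so that the origin is stable, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-mode, two-neuron example (illustrative parameters only).
C = [np.diag([1.0, 1.0]), np.diag([1.2, 0.8])]   # self-decay matrices per mode
A = [np.array([[0.2, -0.1], [0.1, 0.3]]),
     np.array([[0.1, 0.2], [-0.2, 0.1]])]        # delayed feedback weights per mode
Q = np.array([[-2.0, 2.0], [3.0, -3.0]])         # Markov chain generator matrix

dt, tau, T = 1e-3, 0.1, 5.0
steps, delay = int(T / dt), int(tau / dt)
x = np.zeros((steps + 1, 2))
x[0] = [0.5, -0.3]                                # initial state
r = 0                                             # initial Markov mode
f = np.tanh                                       # Lipschitz surrogate activation

for k in range(steps):
    # Mode switch occurs with probability ~ -Q[r, r] * dt over one step.
    if rng.random() < -Q[r, r] * dt:
        r = 1 - r
    xd = x[max(k - delay, 0)]                     # discretely delayed state
    drift = -C[r] @ x[k] + A[r] @ f(xd)
    noise = 0.1 * x[k] * rng.standard_normal(2) * np.sqrt(dt)  # state-dependent diffusion
    x[k + 1] = x[k] + drift * dt + noise          # Euler-Maruyama step

print(np.linalg.norm(x[-1]))                      # decays toward the origin
```

With these (assumed) stable parameters, sample trajectories contract toward the equilibrium despite both the random mode switching and the multiplicative noise, which is the qualitative behavior that mean-square exponential stability criteria certify.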
