Abstract

In this paper, we examine a Bidirectional Associative Memory (BAM) neural network model with distributed delays. Using a result due to Cid [J. Math. Anal. Appl. 281 (2003) 264–275], we prove an exponential stability result in the case where the standard Lipschitz continuity condition is violated; that is, we deal with activation functions which may not be Lipschitz continuous. Consequently, the standard Halanay inequality is not applicable, and we use a nonlinear version of it instead. The resulting differential inequality, which is intended to imply exponential stability, turns out to be 'state dependent': the constant that is usually fixed depends here on the state itself. This adds some difficulties, which we overcome by a suitable argument.
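To fix ideas, a generic BAM system with distributed delays, together with the classical Halanay inequality that the paper's nonlinear variant generalizes, can be written as follows. The notation is illustrative only and need not match the paper's exact formulation.

```latex
% Generic BAM system with distributed (kernel) delays; illustrative notation,
% not necessarily the paper's exact formulation.
\begin{align*}
  x_i'(t) &= -a_i\, x_i(t)
    + \sum_{j=1}^{m} c_{ji}\, f_j\!\Big(\int_0^{\infty} K_{ji}(s)\, y_j(t-s)\, ds\Big) + I_i,
    \quad i = 1, \dots, n, \\
  y_j'(t) &= -b_j\, y_j(t)
    + \sum_{i=1}^{n} d_{ij}\, g_i\!\Big(\int_0^{\infty} \widetilde{K}_{ij}(s)\, x_i(t-s)\, ds\Big) + J_j,
    \quad j = 1, \dots, m.
\end{align*}
% Classical (linear) Halanay inequality: for a nonnegative function v and
% constants a > b > 0,
\[
  v'(t) \le -a\, v(t) + b \sup_{t-\tau \le s \le t} v(s)
  \quad\Longrightarrow\quad
  v(t) \le \Big( \sup_{-\tau \le s \le 0} v(s) \Big) e^{-\gamma t}
  \quad \text{for some } \gamma > 0.
\]
% When the activations f_j, g_i are not Lipschitz, the delayed term is replaced
% by a nonlinear function of the supremum, which is the setting of the
% nonlinear Halanay-type inequality used in the paper.
```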

Highlights

  • Artificial neural networks have received a lot of attention in science and engineering, in economics, biology, medicine, the gas and petroleum industry [1, 25, 40], and in many other disciplines

  • In [18, 38, 39], a set of sufficient conditions based on the system parameters guaranteeing the exponential stability of various retarded bidirectional associative memory (BAM) neural network models was derived by analytical techniques and Lyapunov functionals

  • Motivated by the discussions above, in this paper, we examine the exponential stability of a BAM neural network model with distributed delays


Summary

Introduction

Artificial neural networks have received a lot of attention in science and engineering, in economics, biology, medicine, the gas and petroleum industry [1, 25, 40], and in many other disciplines. In [18, 38, 39], a set of sufficient conditions based on the system parameters guaranteeing the exponential stability of various retarded BAM neural network models was derived by analytical techniques and Lyapunov functionals. The authors in [5, 10, 17, 20, 23] obtained some LMI-based sufficient conditions ensuring either the exponential or the asymptotic stability of BAM neural networks involving delays, via Lyapunov–Krasovskii functionals and analytical inequalities. The activation function of hidden neurons introduces a degree of nonlinearity that is of significant value in most applications of artificial neural networks.
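The analysis in the paper hinges on activations that fail the global Lipschitz condition. The snippet below illustrates the phenomenon with a standard textbook example, the Hölder-continuous function f(u) = sign(u)·|u|^(1/2); this is an assumed example chosen for illustration, not the activation used in the paper.

```python
import numpy as np

def f(u):
    """Hoelder-continuous activation sign(u) * |u|^(1/2).

    A standard example of a function that is continuous (and bounded on
    bounded sets) but NOT Lipschitz at the origin -- the situation the
    paper targets. Illustrative choice, not the paper's activation.
    """
    return np.sign(u) * np.sqrt(np.abs(u))

# The difference quotient |f(u) - f(0)| / |u - 0| = |u|^(-1/2) blows up
# as u -> 0, so no global Lipschitz constant can exist.
for u in [1e-2, 1e-4, 1e-6, 1e-8]:
    print(f"u = {u:.0e}, difference quotient = {abs(f(u)) / u:.1e}")
```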

Model description and preliminaries
Exponential stability
The existence and uniqueness of solutions may be deduced from the following argument
Numerical illustration
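As a rough companion to the numerical illustration, the following is a minimal forward-Euler sketch of a two-neuron BAM system with an exponentially distributed delay kernel. All parameters, the kernel, and the activation are hypothetical choices under the generic model above, not the paper's example.

```python
import numpy as np

# Minimal forward-Euler sketch of a 1+1 neuron BAM system with an
# exponentially distributed delay kernel K(s) = k * exp(-k * s).
# All coefficients below are hypothetical; the paper's numerical example
# may use different parameters, kernels, and activations.

a, b = 2.0, 2.0          # decay rates
c, d = 0.5, 0.5          # interconnection weights
k = 1.0                  # kernel rate
f = lambda u: np.sign(u) * np.sqrt(np.abs(u))   # non-Lipschitz activation

dt, T, S = 1e-3, 20.0, 10.0          # step size, horizon, kernel truncation
n, m = int(T / dt), int(S / dt)
w = k * np.exp(-k * dt * np.arange(m)) * dt     # discretized kernel weights

x, y = np.zeros(n), np.zeros(n)
x[0], y[0] = 1.0, -0.5               # initial state (constant history)

for t in range(n - 1):
    # Truncated convolutions int_0^S K(s) * state(t - s) ds, padding the
    # pre-initial history with the constant initial values.
    pad = max(m - t - 1, 0)
    hx = np.concatenate((np.full(pad, x[0]), x[max(t + 1 - m, 0):t + 1]))[::-1]
    hy = np.concatenate((np.full(pad, y[0]), y[max(t + 1 - m, 0):t + 1]))[::-1]
    x[t + 1] = x[t] + dt * (-a * x[t] + c * f(np.dot(w, hy)))
    y[t + 1] = y[t] + dt * (-b * y[t] + d * f(np.dot(w, hx)))

print("final |x|, |y|:", abs(x[-1]), abs(y[-1]))  # both should decay toward 0
```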