Abstract

For the Hopfield Neural Network problem we consider unbounded, monotone nondecreasing activation functions. We prove exponential convergence to zero provided that the initial data are sufficiently small.

Highlights

  • Of concern is the following system:

    x_i'(t) = -a_i(t) x_i(t) + \sum_{j=1}^{m} b_{ij}(t) f_j(x_j(t)) + c_i(t),   i = 1, ..., m,   (1)

    where a_i(t) ≥ 0 and b_{ij}(t), c_i(t), i, j = 1, ..., m, are continuous functions, and the f_j are the activation functions, which will be assumed continuous and bounded by some nondecreasing (and possibly unbounded) functions. This system appears in Neural Network theory [1, 2]

  • As is well-known, Neural Networks are an important tool in business intelligence

  • Their architecture differs from that of standard computers in that it consists of a large number of processors with dense interconnections between them
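To make the setting of the first highlight concrete, the sketch below integrates a toy instance of system (1) with forward Euler. The constant coefficients, the cubic activation f_j(x) = x³ (monotone nondecreasing and unbounded), and the choice c_i ≡ 0 are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def simulate(x0, a=2.0, b=0.5, T=10.0, dt=1e-3):
    """Forward-Euler integration of a toy instance of system (1):
    x_i' = -a*x_i + sum_j b*f(x_j), with constant coefficients,
    c_i = 0, and the unbounded nondecreasing activation f(x) = x**3."""
    x = np.asarray(x0, dtype=float)
    m = x.size
    B = b * np.ones((m, m))              # constant interconnection weights b_ij = b
    for _ in range(int(T / dt)):
        x = x + dt * (-a * x + B @ x**3)
    return x

# For small initial data the linear damping -a*x dominates the cubic
# coupling, and the state is driven toward zero.
print(simulate([0.1, 0.05]))
```

With these coefficients the trajectory shrinks roughly like e^(-a t), which is the kind of exponential decay the paper establishes for small initial data.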


Summary

Introduction

The coefficients a_i(t) ≥ 0 and b_{ij}(t), c_i(t), i, j = 1, ..., m, are continuous functions, and the f_j are the activation functions, which will be assumed continuous and bounded by some nondecreasing (and possibly unbounded) functions. This system appears in Neural Network theory [1, 2]. Considerable effort has been devoted to improving the set of conditions on the different coefficients involved in the system, as well as on the class of activation functions. Regarding the latter issue, the early assumptions of boundedness, monotonicity, and differentiability have all been relaxed to a mere global Lipschitz condition. The section contains the statement and proof of our result, as well as a crucial lemma we will be using.
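The role of the smallness assumption on the initial data can already be seen in a hypothetical one-neuron version of the system, x' = -a·x + b·x³, with the unbounded activation f(x) = x³ and c = 0 (an illustrative reduction, not the paper's general setting): below the equilibrium threshold √(a/b) the linear damping wins and the solution decays exponentially, while above it the cubic term takes over and the solution escapes.

```python
def scalar_trajectory(x0, a=1.0, b=1.0, T=5.0, dt=1e-3):
    """Forward Euler for the scalar toy equation x' = -a*x + b*x**3.
    Returns |x(T)|, or inf if the solution escapes (finite-time blow-up)."""
    x = float(x0)
    for _ in range(int(T / dt)):
        x += dt * (-a * x + b * x**3)
        if abs(x) > 1e6:              # past this point the cubic term has won
            return float("inf")
    return abs(x)

print(scalar_trajectory(0.1))  # small data: decays roughly like e^(-a*t)
print(scalar_trajectory(2.0))  # large data: escapes to infinity
```

With a = b = 1 the threshold is at |x| = 1, so x0 = 0.1 converges to zero while x0 = 2 blows up, mirroring why the convergence result is stated for sufficiently small initial data.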

Exponential Convergence
Applications