Abstract

This paper is concerned with the problem of local and global asymptotic stability for a class of discrete-time recurrent neural networks, which provide discrete-time analogs of their continuous-time counterparts, i.e., continuous-time recurrent neural networks with distributed delay. Several stability criteria, which include some existing results as special cases, are derived. The dynamical consistency of the discrete-time networks with respect to their continuous-time counterparts is discussed. An unconventional finite-difference method is proposed, and an example is given to show its effectiveness.
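The systems referred to in the abstract and summary ((4), (6), (7), (12), (13)) are not reproduced on this page. Purely as a hedged illustration of the kind of construction described, and not the paper's actual equations, a continuous-time Hopfield-type network with distributed delay and one common semi-discretization of it can be sketched as follows; all symbols here (the states x_i, decay rates a_i, weights b_{ij}, kernel k_{ij}, and step size h) are assumptions introduced for this sketch:

    \dot{x}_i(t) = -a_i x_i(t) + \sum_{j=1}^{n} b_{ij} f_j\Big( \int_0^{\infty} k_{ij}(s)\, x_j(t-s)\, ds \Big) + I_i,

with a candidate discrete-time analog of step size h > 0, obtained by integrating the linear decay term exactly over each interval [mh, (m+1)h]:

    x_i(m+1) = e^{-a_i h}\, x_i(m) + \frac{1 - e^{-a_i h}}{a_i} \Big( \sum_{j=1}^{n} b_{ij} f_j\Big( \sum_{p=0}^{\infty} \kappa_{ij}(p)\, x_j(m-p) \Big) + I_i \Big).

A discretization of this type is dynamically consistent in the sense the abstract refers to: as h tends to 0, the coefficients e^{-a_i h} and (1 - e^{-a_i h})/a_i behave like 1 - a_i h and h, so the iteration recovers the continuous-time dynamics, while the qualitative stability behavior is intended to be preserved for any fixed h > 0.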

Highlights

  • In standard neural network theory, neural activity is described in terms of firing rates

  • First, we will model discrete-time analogs of systems (6) and (7). These discrete-time analogs can be regarded as stand-alone discrete-time recurrent neural networks with discrete-time distributed delay

  • The systems (12) and (13) that we have studied in previous sections can be considered either as stand-alone discrete-time neural networks or as discrete-time analogs of their continuous-time counterparts


Summary

INTRODUCTION

In standard neural network theory, neural activity is described in terms of firing rates. The rate of a neuron is an analog variable that depends nonlinearly on the excitation of the neuron (1). The expression (3) is a static equation: it applies to the situation where a stationary input (a set of firing rates) is mapped to a stationary output (the rate), and it holds when the synaptic filter has finite memory. For continuous-time recurrent neural networks, many results are available in the literature. A discrete-time recurrent neural network, which is a discrete-time analog of system (4), has been studied, and criteria for global stability of the equilibrium have been given in [10], [12], and [13]. First, we model discrete-time analogs of systems (6) and (7); these can be regarded as stand-alone discrete-time recurrent neural networks with discrete-time distributed delay. We close with some concluding remarks on the results.
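To make the construction concrete, the following is a minimal numerical sketch, under the same illustrative notation as the sketch after the abstract, of iterating such a discrete-time analog with a truncated geometric delay kernel. It is not the paper's model, its discretization, or its stability test; every parameter, the kernel, and the activation function are placeholders chosen only so the example runs.

    import numpy as np

    # Illustrative parameters (assumptions, not taken from the paper)
    n = 2                       # number of neurons
    h = 0.1                     # discretization step size
    a = np.array([1.0, 1.2])    # self-decay rates a_i > 0
    B = np.array([[0.2, -0.3],  # connection weights b_ij
                  [0.1,  0.25]])
    I = np.array([0.5, -0.2])   # constant external inputs
    q = 0.5                     # geometric decay of the delay kernel
    K = 50                      # truncation length of the delay memory

    # Normalized geometric kernel kappa(p) = (1 - q) q^p, p = 0..K-1
    kappa = (1.0 - q) * q ** np.arange(K)

    f = np.tanh                 # activation function (assumed sigmoidal)

    # History buffer: hist[p] holds x(m - p); start from a zero history
    hist = np.zeros((K, n))

    for m in range(500):
        # Discrete-time distributed delay: weighted sum over past states
        delayed = kappa @ hist
        # Semi-discretization step: linear decay term integrated exactly
        decay = np.exp(-a * h)
        x_next = decay * hist[0] + (1.0 - decay) / a * (B @ f(delayed) + I)
        # Shift the history buffer
        hist = np.vstack([x_next, hist[:-1]])

    print("state after 500 steps:", hist[0])

With the small weights assumed above, the iteration settles to a single equilibrium; whether such convergence holds in general is exactly the kind of question the paper's local and global stability criteria address.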

DISCRETE-TIME RECURRENT NEURAL NETWORKS
LINEAR APPROXIMATION AND LOCAL STABILITY
GLOBAL STABILITY
DISCRETE-TIME NEURAL NETWORKS VERSUS THEIR CONTINUOUS-TIME COUNTERPARTS
CONCLUDING REMARKS
