Abstract

A general class of discrete-time recurrent neural networks (DTRNNs) is formulated and studied in this paper. Several sufficient conditions are obtained that ensure the global stability of DTRNNs with delays, based on the induction principle rather than the well-known Liapunov methods. The results assume neither symmetry of the connection matrix nor boundedness, monotonicity, or differentiability of the activation functions. In addition, discrete-time analogues of a general class of continuous-time recurrent neural networks (CTRNNs) are derived and studied. The convergence characteristics of the CTRNNs are preserved by the discrete-time analogues without any restriction on the uniform discretization step size. Finally, simulation results demonstrate the validity and feasibility of the proposed approach.
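To make the setting concrete, the following is a minimal sketch of iterating a discrete-time analogue of a delayed CTRNN. The model, the exponential-Euler discretization scheme, and all parameter values here are illustrative assumptions for demonstration, not the paper's exact formulation; the scheme is one standard way to obtain a discrete analogue whose behavior does not degrade as the step size h grows.

```python
import numpy as np

def simulate_dtrnn(W, a, I, f, delay, h, x0, steps):
    """Iterate a discrete-time analogue of the delayed CTRNN
        dx_i/dt = -a_i x_i(t) + sum_j w_ij f_j(x_j(t - tau)) + I_i
    using the exponential-Euler scheme (an assumed, illustrative choice):
        x(n+1) = e^{-a h} x(n) + ((1 - e^{-a h}) / a) (W f(x(n - k)) + I).
    """
    # Constant initial history covering the delay window.
    hist = [np.array(x0, dtype=float)] * (delay + 1)
    decay = np.exp(-a * h)          # per-step self-decay factor
    gain = (1.0 - decay) / a        # integrated input gain over one step
    traj = [hist[-1]]
    for n in range(steps):
        x_delayed = hist[0]         # state k = delay steps in the past
        x_next = decay * traj[-1] + gain * (W @ f(x_delayed) + I)
        hist = hist[1:] + [x_next]
        traj.append(x_next)
    return np.array(traj)

# Illustrative 2-neuron network; saturating piecewise-linear activation
# (the stability theory itself does not require boundedness or monotonicity).
W = np.array([[0.1, -0.2], [0.05, 0.1]])
a = np.array([1.0, 1.5])
I = np.array([0.5, -0.3])
f = lambda x: 0.5 * (np.abs(x + 1) - np.abs(x - 1))

traj = simulate_dtrnn(W, a, I, f, delay=3, h=0.5, x0=[2.0, -2.0], steps=200)
print(traj[-1])  # state has settled near a unique equilibrium
```

With these (assumed) small connection weights the iteration is a contraction, so the trajectory converges to the same equilibrium regardless of the step size h, mirroring the step-size-independence claimed in the abstract.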
