Abstract

This paper presents a passivity analysis for a class of discrete-time recurrent neural networks (DRNNs) with norm-bounded time-varying parameter uncertainties and interval time-varying delay. The activation functions are assumed to be globally Lipschitz continuous. Based on an appropriate type of Lyapunov functional, sufficient passivity conditions for the DRNNs are derived in terms of a family of linear matrix inequalities (LMIs). Two numerical examples are given to illustrate the effectiveness and applicability of the proposed conditions.

Highlights

  • Recurrent neural networks have been extensively studied in the past decades

  • Increasing attention has been drawn to the potential applications of recurrent neural networks in information processing systems such as signal processing, model identification, optimization, pattern recognition, and associative memory

  • This study has investigated the problem of globally robust passivity conditions for a discrete-time recurrent uncertain neural network with interval time-varying delay

Summary

Introduction

Recurrent neural networks (RNNs) have been extensively studied in the past decades. Two popular examples are Hopfield neural networks and cellular neural networks. Increasing attention has been drawn to the potential applications of RNNs in information processing systems such as signal processing, model identification, optimization, pattern recognition, and associative memory. These successful applications depend greatly on the dynamic behavior of RNNs. On the other hand, time delay is inevitably encountered in RNNs, since the interactions between different neurons are asynchronous. Under different assumptions on the activation functions, a unified linear matrix inequality (LMI) approach has been developed in [22] to establish sufficient conditions for discrete-time recurrent neural networks with interval time-varying delay to be globally exponentially stable. The purpose of this paper is to deal with the problem of passivity conditions for discrete-time uncertain recurrent neural networks with interval time-varying delay. Throughout this paper, the notation X ≥ Y (resp., X > Y) for symmetric matrices X and Y indicates that the matrix X − Y is positive semidefinite (resp., positive definite); Z^T represents the transpose of the matrix Z.
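The matrix ordering defined above (X ≥ Y meaning X − Y is positive semidefinite) is what the LMI conditions of the paper are stated in. As a minimal illustrative sketch, not the paper's method, the check below tests this ordering numerically via the eigenvalues of the symmetric difference; the function name `loewner_ge` and the sample matrices are this sketch's own choices.

```python
import numpy as np

def loewner_ge(X, Y, tol=1e-9):
    """Return True if X - Y is positive semidefinite, i.e. X >= Y
    in the ordering used for symmetric matrices in LMI conditions."""
    D = np.asarray(X, dtype=float) - np.asarray(Y, dtype=float)
    D = (D + D.T) / 2.0  # symmetrize to guard against round-off asymmetry
    # X >= Y iff every eigenvalue of the symmetric part of X - Y is >= 0
    return bool(np.all(np.linalg.eigvalsh(D) >= -tol))

# Example: X - Y = [[2, 1], [1, 1]] has strictly positive eigenvalues,
# so X > Y (and in particular X >= Y) holds here.
X = np.array([[3.0, 1.0], [1.0, 2.0]])
Y = np.array([[1.0, 0.0], [0.0, 1.0]])
print(loewner_ge(X, Y))  # True
```

Feasibility of the full LMI families derived in the paper would in practice be delegated to a semidefinite programming solver rather than checked matrix-by-matrix like this.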

Preliminaries
Mathematical Formulation of the Proposed Approach
Examples
Conclusions
