Abstract

The discrete-time delayed neural network with complex-valued linear threshold neurons is considered. By constructing appropriate Lyapunov-Krasovskii functionals and employing the linear matrix inequality (LMI) technique together with analysis methods, several new delay-dependent criteria for checking boundedness and global exponential stability are established. Illustrative examples are given to show the effectiveness and reduced conservatism of the proposed criteria.
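
The paper's precise functionals are not reproduced in this summary. As an illustration only, a delay-dependent criterion of this type is typically derived from a Lyapunov-Krasovskii functional of the generic form

\[
V(k) = x^{*}(k)\,P\,x(k) + \sum_{j=k-\tau}^{k-1} x^{*}(j)\,Q\,x(j),
\]

where \(x(k)\) is the (complex-valued) state, \(\tau\) is the transmission delay, \(x^{*}\) denotes the conjugate transpose, and \(P, Q\) are positive definite matrices to be determined. Requiring the forward difference \(\Delta V(k) = V(k+1) - V(k)\) to satisfy \(\Delta V(k) \le -\varepsilon \|x(k)\|^{2}\) along trajectories reduces, after standard bounding steps, to a linear matrix inequality in \(P\) and \(Q\) that can be checked numerically. This is a sketch of the standard construction, not the exact functionals used in the paper.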

Highlights

  • In the past decade, neural networks have received increasing interest owing to their applications in many areas such as signal processing, pattern recognition, associative memories, parallel computation, and optimization solvers [1].

  • During implementation on very large-scale integrated chips, transmission time delays can destroy the dynamical behaviors of neural networks.

  • Some important results on boundedness, convergence, global exponential stability, synchronization, state estimation, and passivity analysis have been reported for delayed neural networks; see [1–9] and the references therein for some recent publications.


Summary

Introduction

Neural networks have received increasing interest owing to their applications in many areas such as signal processing, pattern recognition, associative memories, parallel computation, and optimization solvers [1]. In such applications, the qualitative analysis of the dynamical behaviors is a necessary step for the practical design of neural networks [2]. A class of discrete-time recurrent neural networks with complex-valued weights and activation functions was considered in [22, 23]. In [23], the boundedness, global attractivity, and complete stability were investigated for discrete-time recurrent neural networks with complex-valued linear threshold neurons. Motivated by the above discussions, the objective of this paper is to study the problem of boundedness and stability of discrete-time delayed neural networks with complex-valued linear threshold neurons.
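
The exact model is specified in the next section; as a sketch based on the complex-valued linear threshold networks of [22, 23], a discrete-time delayed version can be written as

\[
z(k+1) = \sigma\bigl(A z(k) + B z(k-\tau) + h\bigr), \qquad
\sigma(u) = \max(0, \operatorname{Re} u) + \mathrm{i}\,\max(0, \operatorname{Im} u),
\]

with \(\sigma\) applied componentwise. Here \(z(k) \in \mathbb{C}^{n}\) is the neuron state, \(A\) and \(B\) are complex-valued connection weight matrices for the instantaneous and delayed terms, \(\tau \ge 0\) is the transmission delay, \(h\) is a complex external input, and \(\sigma\) is the linear threshold activation commonly used for complex-valued neurons, rectifying the real and imaginary parts separately. The symbols \(A\), \(B\), \(h\), and \(\tau\) are illustrative placeholders rather than the paper's notation.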

Model Description and Preliminaries
The Main Results and Their Proofs
Examples
Conclusion
