Abstract

This article investigates global asymptotic stability for neural networks (NNs) with a time-varying delay that is differentiable and uniformly bounded and whose derivative exists and is upper-bounded. First, we propose an extended secondary delay partitioning technique to construct a novel Lyapunov-Krasovskii functional in which both single-integral and double-integral state variables are considered, whereas the traditional secondary delay partitioning handles only the single-integral terms. Second, a novel free-weight matrix equality (FWME) is presented to resolve the reciprocal convex combination problem equivalently and directly, without the Schur complement; it eliminates the need for positive definite matrices and is less conservative and restrictive than various improved reciprocal convex inequalities. Furthermore, by combining the extended secondary delay partitioning, the equivalent reciprocal convex combination technique, and the Bessel-Legendre inequality, two relaxed sufficient conditions ensuring global asymptotic stability of the NNs are obtained, for time-varying delays with unknown and with known lower bounds on the delay derivative, respectively. Finally, two examples illustrate the superiority and effectiveness of the presented method.
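For context, the class of systems studied in such analyses is typically a delayed neural network of the following standard form; the symbols below are a generic sketch of this model class, not notation taken from the paper itself:

```latex
% Delayed neural network model (generic form commonly used in this literature):
%   \dot{x}(t) = -A x(t) + W_0 f(x(t)) + W_1 f(x(t-\tau(t))) + u,
% where x(t) \in \mathbb{R}^n is the neuron state, A = \mathrm{diag}(a_1,\dots,a_n) > 0,
% W_0, W_1 are connection-weight matrices, f(\cdot) is the activation function,
% and u is a constant input. The time-varying delay \tau(t) satisfies
%   0 \le \tau(t) \le \bar{\tau}, \qquad \mu_1 \le \dot{\tau}(t) \le \mu_2,
% matching the abstract's assumptions: \tau(t) differentiable and uniformly
% bounded, with an upper-bounded derivative; the lower bound \mu_1 may be
% unknown or known, which is what distinguishes the two stability conditions.
\begin{align}
\dot{x}(t) &= -A x(t) + W_0 f(x(t)) + W_1 f\big(x(t-\tau(t))\big) + u,\\
0 &\le \tau(t) \le \bar{\tau}, \qquad \mu_1 \le \dot{\tau}(t) \le \mu_2.
\end{align}
```

A Lyapunov-Krasovskii functional for this model then augments a quadratic term in \(x(t)\) with integral terms over the partitioned delay intervals; the abstract's contribution is to include double-integral state variables in that augmentation as well.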
