Abstract

Combining Lyapunov–Krasovskii functional theory with the reciprocally convex technique, a new sufficient condition is derived that guarantees the global stability of recurrent neural networks with both time-varying and continuously distributed delays; an improved delay-partitioning technique is employed in the derivation. The LMI-based criterion depends on both the upper and lower bounds of the state delay and of its derivative, which distinguishes it from existing results and broadens its applicability whenever the lower bound of the delay derivative is available. Finally, numerical examples illustrate that refining the partition of the delay interval reduces the conservatism of the derived results.
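The class of systems the abstract refers to can be illustrated numerically. The sketch below (not the paper's LMI criterion) simulates a small recurrent neural network with a bounded time-varying delay, x'(t) = -A x(t) + W f(x(t)) + W1 f(x(t - tau(t))), and checks that trajectories started from different initial histories converge, which is what global stability predicts. All matrices, the delay profile, and the parameter values are hypothetical example data chosen so that a standard delay-dependent stability condition plausibly holds.

```python
import numpy as np

def simulate(x0, T=30.0, dt=0.01):
    """Forward-Euler simulation of a delayed recurrent neural network.

    Example data (hypothetical): positive diagonal decay A, small
    instantaneous weights W, small delayed weights W1, tanh activations,
    and a time-varying delay tau(t) bounded by tau_max.
    """
    A = np.diag([2.0, 2.0])                  # self-decay rates
    W = np.array([[0.3, -0.2], [0.1, 0.2]])  # instantaneous weights
    W1 = np.array([[0.2, 0.1], [-0.1, 0.3]]) # delayed weights
    tau_max = 0.5
    n_steps = int(T / dt)
    n_hist = int(tau_max / dt) + 1
    # Constant initial history on [-tau_max, 0].
    hist = np.tile(np.asarray(x0, dtype=float), (n_hist, 1))
    x = hist[-1].copy()
    for k in range(n_steps):
        t = k * dt
        tau = 0.25 + 0.2 * np.sin(t)         # time-varying delay in (0, tau_max]
        d = min(int(round(tau / dt)), n_hist - 1)
        x_delayed = hist[-1 - d]             # state at time t - tau(t)
        dx = -A @ x + W @ np.tanh(x) + W1 @ np.tanh(x_delayed)
        x = x + dt * dx                      # Euler step
        hist = np.vstack([hist[1:], x])      # slide the delay buffer
    return x

# Trajectories from two different initial conditions should meet.
xa = simulate([2.0, -1.5])
xb = simulate([-3.0, 4.0])
print(np.linalg.norm(xa - xb))
```

Because tanh has Lipschitz constant 1 and the decay rates dominate the combined weight norms here, both trajectories contract toward the same equilibrium, so the printed distance is near zero.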
