Abstract

To solve the general time-varying Sylvester equation, a novel integral recurrent neural network (IRNN) is designed and analyzed. This kind of recurrent neural network is based on an error-integral design equation and does not need to be trained in advance. The IRNN achieves global convergence and strong robustness if odd, monotonically increasing activation functions [i.e., the linear, bipolar-sigmoid, power, or sigmoid-power activation functions (SP-AFs)] are applied. Specifically, if the linear or bipolar-sigmoid activation function is applied, the IRNN possesses exponential convergence; if the power activation function is used, the IRNN converges in finite time. To obtain both faster convergence and the finite-time convergence property, an SP-AF is designed. Furthermore, by using a discretization method, the discrete IRNN model and its convergence analysis are also presented. A practical application to a robot manipulator and computer simulations with different activation functions and design parameters verify the effectiveness, stability, and reliability of the proposed IRNN.
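The abstract does not reproduce the IRNN design equation itself, so the following Python sketch only illustrates the underlying idea: drive the Sylvester residual E = A X + X B - C toward zero with a correction built from the current error plus an accumulated (integrated) error term. For simplicity the sketch treats a static snapshot of the equation and uses a gradient-style correction; the step size h, the gains gamma and lam, and the linear activation are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Minimal sketch of a discrete error-integral iteration for A X + X B = C.
# The update rule, gains, and step size are assumptions chosen for illustration;
# the paper's IRNN design equation is not given in the abstract.

def linear_activation(E):
    # Linear activation; the bipolar-sigmoid, power, or SP activation
    # functions mentioned in the abstract could be substituted here.
    return E

def irnn_sketch(A, B, C, steps=2000, h=1e-3, gamma=10.0, lam=5.0):
    n, m = C.shape
    X = np.zeros((n, m))            # current solution estimate
    S = np.zeros((n, m))            # accumulated (integrated) error
    for _ in range(steps):
        E = A @ X + X @ B - C       # residual of the Sylvester equation
        S += h * linear_activation(E)
        # Gradient-descent-style correction driven by the error plus its
        # integral (a stand-in for an error-integral design equation).
        G = A.T @ (E + lam * S) + (E + lam * S) @ B.T
        X -= h * gamma * G
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.diag([2.0, 3.0, 4.0])    # positive-definite A and B keep this
    B = np.diag([1.0, 2.0, 3.0])    # static test problem well conditioned
    C = rng.standard_normal((3, 3))
    X = irnn_sketch(A, B, C)
    print("residual norm:", np.linalg.norm(A @ X + X @ B - C))
```

Swapping linear_activation for a power-type function would mimic the activation-function comparison discussed in the abstract, though the convergence guarantees stated there apply to the authors' IRNN, not to this simplified iteration.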
