Abstract

Neural networks are generally divided into dynamic (recurrent) neural networks and static neural networks; the former refer to networks with one or more feedback links in their structure. Owing to this complex network structure, recurrent networks inevitably suffer from problems such as poor approximation performance and poor stability and convergence behavior. The noise-tolerant gradient-oriented neurodynamic (NTGON) model proposed in this study improves on the traditional gradient neural network (GNN) model. The proposed NTGON model obtains accurate and efficient results when solving the Sylvester equation under various kinds of noise, so it can be applied effectively to the noise-polluted problems frequently encountered in practical engineering. In contrast to the original GNN model for the Sylvester equation, the NTGON model converges exponentially to the theoretical solution from any initial state. It is demonstrated that the noise-polluted NTGON model converges globally to the theoretical solution regardless of the magnitude of the unknown matrix-form noise. Furthermore, simulation results show that, in the presence of noise, the proposed NTGON model outperforms the original GNN model in solving the Sylvester equation.
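
The exact NTGON dynamics are not stated in the abstract, so the sketch below only illustrates the baseline GNN design for the Sylvester equation A X + X B = C that the proposed model improves upon: the state X(t) evolves along the negative gradient of the residual energy (1/2)‖A X + X B − C‖_F², discretized here with explicit Euler steps. The gain, step size, and test matrices are illustrative assumptions.

```python
# Minimal sketch, NOT the paper's NTGON model: the classic gradient neural
# network (GNN) design for the Sylvester equation A X + X B = C, integrated
# with explicit Euler steps. Gain gamma, step size dt, and the test matrices
# are illustrative assumptions.
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)
n = 4
G = rng.standard_normal((n, n))
A = G @ G.T / n + np.eye(n)      # symmetric positive definite, so the solution is unique
H = rng.standard_normal((n, n))
B = H @ H.T / n + np.eye(n)
C = rng.standard_normal((n, n))

X = np.zeros((n, n))             # arbitrary initial state
gamma, dt, steps = 1.0, 1e-3, 20000
for _ in range(steps):
    E = A @ X + X @ B - C                    # residual (error matrix)
    X -= dt * gamma * (A.T @ E + E @ B.T)    # gradient flow: dX/dt = -gamma * d/dX (||E||_F^2 / 2)

X_ref = solve_sylvester(A, B, C)             # reference solution for comparison
print("GNN residual norm:      ", np.linalg.norm(A @ X + X @ B - C))
print("distance from reference:", np.linalg.norm(X - X_ref))
```

Noise tolerance in the paper's sense would mean adding an unknown matrix-form disturbance to the right-hand side of the update above; the plain GNN flow shown here degrades under such noise, which is the gap the NTGON model is designed to close.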
