Abstract
To solve matrix-type linear time-varying equations more efficiently, a novel exponential-type varying-gain recurrent neural network (EVG-RNN) is proposed in this paper. Unlike the traditional fixed-parameter gain recurrent neural network (FG-RNN), the proposed EVG-RNN is derived from a vector- or matrix-based unbounded error function by a varying-parameter neural dynamic approach. With four different kinds of activation functions, the super-exponential convergence of EVG-RNN is proved theoretically in detail, and its error convergence rate is shown to be much faster than that of FG-RNN. In addition, it is proved mathematically that the computation errors of EVG-RNN converge to zero and that the network can suppress external disturbances. Finally, a series of computer simulations verifies and illustrates that EVG-RNN achieves better convergence and robustness than FG-RNN and FTZNN when solving the identical linear time-varying equation.
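
The abstract does not give the exact EVG-RNN dynamics, but the general idea can be sketched as follows. The snippet below is a minimal Python illustration, assuming the standard zeroing-neural-network design formula dE/dt = -gamma(t) * Phi(E(t)) applied to the matrix error E(t) = A(t)X(t) - B(t) for the equation A(t)X(t) = B(t). The particular matrices A(t) and B(t), the exponential-type gain gamma(t) = exp(lambda*t) + lambda, and the linear activation Phi(E) = E are illustrative assumptions, not the paper's exact formulation; replacing gamma(t) with a constant recovers a fixed-gain (FG-RNN-style) scheme for comparison.

```python
import numpy as np

# Hypothetical time-varying coefficients of A(t) X(t) = B(t) (chosen so A(t) stays invertible).
def A(t):
    return np.array([[2 + np.sin(t), np.cos(t)],
                     [-np.cos(t),    2 + np.sin(t)]])

def A_dot(t):
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def B(t):
    return np.array([[np.sin(2 * t), np.cos(2 * t)],
                     [np.cos(2 * t), np.sin(2 * t)]])

def B_dot(t):
    return 2 * np.array([[np.cos(2 * t), -np.sin(2 * t)],
                         [-np.sin(2 * t), np.cos(2 * t)]])

def gain(t, lam=2.0):
    # Assumed exponential-type time-varying gain; a fixed-gain network would use a constant here.
    return np.exp(lam * t) + lam

def activation(E):
    # Linear activation; the paper analyses four activation-function choices.
    return E

def evg_rnn_solve(t_end=2.0, dt=1e-4, lam=2.0):
    """Euler integration of the neural dynamic
       A(t) X_dot = -A_dot(t) X + B_dot(t) - gain(t) * activation(A(t) X - B(t)),
       obtained by differentiating E(t) = A(t) X(t) - B(t) and imposing
       dE/dt = -gain(t) * activation(E)."""
    t = 0.0
    X = np.zeros((2, 2))                 # arbitrary initial state
    while t < t_end:
        E = A(t) @ X - B(t)              # matrix-valued error function
        rhs = -A_dot(t) @ X + B_dot(t) - gain(t, lam) * activation(E)
        X_dot = np.linalg.solve(A(t), rhs)
        X += dt * X_dot
        t += dt
    return X, np.linalg.norm(A(t) @ X - B(t))

if __name__ == "__main__":
    X, residual = evg_rnn_solve()
    print("final residual:", residual)    # residual should be driven close to zero
```

With a constant gain the residual decays at a fixed exponential rate, whereas the growing gain drives it down progressively faster, which is the qualitative behaviour the abstract attributes to EVG-RNN.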