Abstract

Since the convergence of a neural network depends on its learning rates, the learning rates of a training algorithm for neural networks are very important factors. We therefore propose adaptive learning rates (ALRs) for an extended Kalman filter (EKF) based training algorithm for wavelet neural networks (WNNs). The ALRs of the EKF-based training algorithm guarantee the convergence of the WNN. We also derive a convergence analysis of the learning process from the discrete Lyapunov stability theorem. Several simulation results show that the EKF-based WNN with ALRs adapts to abrupt changes and high nonlinearity with satisfactory performance.

Keywords: Extended Kalman Filter, Asymptotic Convergence, Wavelet Neural Network, Past Output, Adaptive Learning Rate

(These keywords were added by machine and not by the authors. This process is experimental, and the keywords may be updated as the learning algorithm improves.)
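To make the setting concrete, the following is a minimal sketch of EKF-based training for a one-input wavelet network, where the output is a weighted sum of dilated and translated Mexican-hat wavelets and the EKF updates the weights, translations, and dilations jointly. This is an illustrative assumption, not the paper's implementation: the paper's specific adaptive learning rate rule and convergence conditions are not reproduced here, and all names (`psi`, `ekf_step`, the noise parameters `q` and `r`) are hypothetical.

```python
import numpy as np

# Hedged sketch: EKF-style training of a 1-input wavelet network
#   y_hat = sum_j w_j * psi((x - t_j) / d_j)
# with the Mexican-hat mother wavelet. The parameter vector theta stacks
# the weights w, translations t, and dilations d of J wavelet nodes.

def psi(z):
    # Mexican-hat mother wavelet
    return (1.0 - z**2) * np.exp(-0.5 * z**2)

def dpsi(z):
    # derivative of the Mexican-hat wavelet with respect to z
    return (z**3 - 3.0 * z) * np.exp(-0.5 * z**2)

def predict(theta, x, J):
    w, t, d = theta[:J], theta[J:2*J], theta[2*J:]
    return np.dot(w, psi((x - t) / d))

def jacobian(theta, x, J):
    # gradient of the network output with respect to all parameters
    w, t, d = theta[:J], theta[J:2*J], theta[2*J:]
    z = (x - t) / d
    H = np.empty(3 * J)
    H[:J] = psi(z)                    # d y_hat / d w_j
    H[J:2*J] = -w * dpsi(z) / d       # d y_hat / d t_j
    H[2*J:] = -w * dpsi(z) * z / d    # d y_hat / d d_j
    return H

def ekf_step(theta, P, x, y, J, q=1e-6, r=1e-2):
    # one EKF parameter update for a scalar output; q and r are
    # assumed process/measurement noise variances, not values from the paper
    H = jacobian(theta, x, J)
    S = H @ P @ H + r                 # innovation variance (scalar)
    K = (P @ H) / S                   # Kalman gain
    theta = theta + K * (y - predict(theta, x, J))
    P = P - np.outer(K, H @ P) + q * np.eye(len(theta))
    theta[2*J:] = np.clip(theta[2*J:], 0.1, None)  # keep dilations positive (pragmatic stabilizer)
    return theta, P

rng = np.random.default_rng(0)
J = 5                                 # number of wavelet nodes
theta = rng.normal(scale=0.5, size=3 * J)
theta[2*J:] = np.abs(theta[2*J:]) + 0.5
P = np.eye(3 * J)

target = lambda x: np.sin(3 * x) * np.exp(-x**2)
errs = []
for _ in range(2000):
    x = rng.uniform(-2, 2)
    y = target(x)
    theta, P = ekf_step(theta, P, x, y, J)
    errs.append((y - predict(theta, x, J))**2)

print(np.mean(errs[:100]), np.mean(errs[-100:]))
```

In this sketch the EKF plays the role of a second-order training rule: the covariance matrix P acts as a per-parameter, time-varying step size, which is the quantity the paper's adaptive learning rates regulate to guarantee Lyapunov stability.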

