Abstract

It is well known that the stochastic gradient (SG) identification algorithm has a poor convergence rate. To improve the convergence rate, we extend the SG algorithm from the viewpoint of innovation modification and present multi-innovation gradient-type identification algorithms, including a multi-innovation stochastic gradient (MISG) algorithm and a multi-innovation forgetting gradient (MIFG) algorithm. Because the multi-innovation gradient-type algorithms use not only the current data but also past data at each iteration, the parameter estimation accuracy can be improved. Finally, the performance analysis and simulation results show that the proposed MISG and MIFG algorithms achieve faster convergence and better tracking performance than the corresponding SG algorithms.
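To make the key idea concrete, the sketch below illustrates a multi-innovation stochastic gradient update of the kind described in the abstract: instead of correcting the estimate with only the current scalar innovation, the update stacks the most recent p regressors and outputs into an innovation vector. The model form y(t) = φ(t)ᵀθ + v(t), the gain normalization r(t), and the function and variable names are assumptions drawn from the general multi-innovation gradient literature, not details given in this abstract.

```python
import numpy as np

def misg_identify(y, phi, p=5):
    """Hypothetical multi-innovation stochastic gradient (MISG) sketch.

    Assumes the linear regression model y(t) = phi(t)^T * theta + v(t).
    With p = 1 the update reduces to the ordinary SG algorithm.
    """
    n = phi.shape[1]                    # number of parameters
    theta = np.zeros(n)                 # initial parameter estimate
    r = 1.0                             # gain normalization (assumed form)
    for t in range(len(y)):
        lo = max(0, t - p + 1)
        Phi = phi[lo:t + 1].T           # stacked current and past regressors
        E = y[lo:t + 1] - Phi.T @ theta  # multi-innovation vector
        r += phi[t] @ phi[t]            # same step-size recursion as SG
        theta = theta + (Phi @ E) / r   # multi-innovation correction
    return theta

# Usage sketch on synthetic data (assumed, for illustration only):
rng = np.random.default_rng(0)
theta_true = np.array([0.8, -0.4])
phi = rng.normal(size=(500, 2))
y = phi @ theta_true + 0.1 * rng.normal(size=500)
print(misg_identify(y, phi, p=5))   # estimate approaches theta_true
```

Because each step reuses the p most recent data pairs, the correction term carries more information per iteration than the single-innovation SG step, which is the mechanism behind the improved accuracy and convergence reported in the paper; the forgetting-gradient (MIFG) variant would additionally weight r(t) with a forgetting factor.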
