Abstract

In this brief, we develop a least mean square (LMS) algorithm that converges in a statistical sense to the global minimum of the mean square error (MSE) objective function. This is accomplished by estimating the gradient from a smoothed version of the MSE. The smoothed MSE objective begins as a convex functional in the mean; the amount of dispersion, or smoothing, is then reduced over time, so that the objective becomes the true MSE as the algorithm converges to the global minimum. We show that this smoothing behavior is approximated by appending a variable noise source to the infinite impulse response (IIR)-LMS algorithm. We show experimentally that the proposed method converges to the global minimum in the cases tested, achieving a performance improvement over both the IIR-LMS algorithm and the Steiglitz-McBride algorithm.
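
The abstract does not give the exact update equations, so the following Python sketch illustrates only one plausible reading of the annealed-noise idea: a simplified IIR-LMS coefficient update (ignoring the recursive gradient terms of the full IIR-LMS algorithm) perturbed by zero-mean noise whose variance decays over time, mimicking a smoothing of the objective that shrinks as adaptation proceeds. All names and the noise schedule (`sigma0`, `decay`) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def noisy_iir_lms(x, d, n_b=2, n_a=2, mu=1e-3, sigma0=0.1, decay=0.999):
    """Sketch of an IIR-LMS update with an annealed perturbation noise.

    Hypothetical signature; the noise variance sigma**2 is reduced
    toward zero to approximate the shrinking smoothing of the MSE
    described in the abstract.
    """
    b = np.zeros(n_b)        # feedforward (moving-average) coefficients
    a = np.zeros(n_a)        # feedback (autoregressive) coefficients
    y_hist = np.zeros(n_a)   # most recent past filter outputs
    sigma = sigma0
    e = np.zeros(len(x))
    for n in range(len(x)):
        # Most recent n_b input samples, newest first, zero-padded at start.
        x_vec = x[max(0, n - n_b + 1):n + 1][::-1]
        x_vec = np.pad(x_vec, (0, n_b - len(x_vec)))
        y = b @ x_vec + a @ y_hist
        e[n] = d[n] - y
        # Simplified instantaneous gradient step (recursive gradient terms
        # omitted) plus zero-mean noise whose variance is annealed to zero.
        b += mu * e[n] * x_vec + sigma * np.random.randn(n_b)
        a += mu * e[n] * y_hist + sigma * np.random.randn(n_a)
        sigma *= decay       # reduce the effective smoothing over time
        y_hist = np.roll(y_hist, 1)
        y_hist[0] = y
    return b, a, e
```

The multiplicative `decay` factor plays the role of the annealing schedule: a slower decay keeps the effective objective smoothed (and hence closer to convex) for longer, at the cost of slower final convergence once the iterate is near the global minimum.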
