Abstract

The momentum least-mean-square (MLMS) algorithm, a modified version of the well-known LMS algorithm, has recently been proposed, together with an analysis of its basic convergence properties. The authors revise the ranges of the MLMS algorithm's parameters for which convergence is guaranteed, and derive precise expressions for the algorithm's convergence rate and steady-state performance under slow-learning conditions. These results show that, with Gaussian inputs and a low adaptation rate, the LMS and MLMS algorithms are equivalent, whereas with inputs containing impulse noise components the MLMS algorithm performs better: its increased inertia makes it more robust against short-term disturbances in the filter input, at the expense of increased computational complexity.
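To make the comparison concrete, the following is a minimal sketch of the momentum-LMS update as it appears in the standard momentum-LMS literature; the update form, the function name `mlms_identify`, and all parameter values here are illustrative assumptions, not necessarily this paper's exact notation or recommended settings.

```python
import numpy as np

def mlms_identify(x, d, n_taps, mu, gamma):
    """Adapt an FIR filter to the desired signal d with momentum LMS.

    Assumed update (standard momentum-LMS form, |gamma| < 1):
        w[n+1] = w[n] + mu * e[n] * u[n] + gamma * (w[n] - w[n-1])
    where u[n] is the tap-input vector and e[n] the a-priori error.
    Setting gamma = 0 recovers the ordinary LMS algorithm.
    """
    w = np.zeros(n_taps)
    w_prev = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1 : n + 1][::-1]        # newest sample first
        e = d[n] - w @ u                            # a-priori error
        w_next = w + mu * e * u + gamma * (w - w_prev)
        w_prev, w = w, w_next
    return w

# Demo: identify a known (hypothetical) 4-tap FIR system from
# noiseless Gaussian input data.
rng = np.random.default_rng(0)
x = rng.standard_normal(20000)
h_true = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h_true)[: len(x)]
w_hat = mlms_identify(x, d, n_taps=4, mu=0.01, gamma=0.5)
```

The momentum term `gamma * (w - w_prev)` is what gives the algorithm its inertia: a single impulsive input sample perturbs the gradient step `mu * e * u`, but the weight trajectory is smoothed by the accumulated past updates.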
