Abstract

The least mean square (LMS) algorithm is known to converge in the mean and in the mean square. However, over short time periods the error sequence can blow up and cause severe disturbances, especially for non-Gaussian processes. The paper discusses potential short-time unstable behavior of the LMS algorithm for spherically invariant random processes (SIRPs) such as the Gaussian, Laplacian, and K0 distributions. The result of this investigation is that the probability of bursting decreases with the step size. However, since a smaller step size also causes a slower convergence rate, one has to trade off convergence speed against the frequency of bursting.
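For context, a standard form of the LMS update (generic notation, not reproduced from the paper itself) shows where the step size $\mu$ enters:

$$e_n = d_n - \mathbf{w}_n^{\top}\mathbf{x}_n, \qquad \mathbf{w}_{n+1} = \mathbf{w}_n + \mu\, e_n \mathbf{x}_n .$$

A smaller $\mu$ damps the correction $\mu\, e_n \mathbf{x}_n$ applied at each step, which limits how far a short burst in $e_n$ can drive the weights, but it also slows the rate at which $\mathbf{w}_n$ approaches its optimum.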
