Abstract

This paper surveys recent advances in sparse least mean square (LMS) algorithms. Since the standard LMS algorithm does not exploit sparsity information about the channel being estimated, we discuss various sparse LMS algorithms that aim to outperform standard LMS in sparse channel estimation. Sparse LMS algorithms force the solution to be sparse by adding a sparsity-inducing penalty to the standard LMS cost function. Under reasonable conditions on the training data and parameters, sparse LMS algorithms are shown to be mean-square stable, and their mean square error performance and convergence rate are better than those of the standard LMS algorithm. We introduce the sparse algorithms under a Gaussian noise model. The simulation results presented in this work are useful for comparing sparse LMS algorithms against each other and against the standard LMS algorithm.
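To make the sparse-penalty idea concrete, the following is a minimal sketch of one well-known variant of this family, the zero-attracting LMS (ZA-LMS), in which an l1 penalty on the filter weights adds a sign-based "zero attractor" term to the standard LMS update. The function name, parameter names, and all numeric values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def za_lms(x, d, num_taps, mu=0.01, rho=1e-4):
    """Zero-attracting LMS sketch (illustrative, not the paper's code).

    mu  -- step size of the standard LMS gradient step
    rho -- weight of the l1 (zero-attracting) sparsity penalty
    """
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # tap-input vector [x[n], ..., x[n-M+1]]
        e = d[n] - w @ u                     # a priori estimation error
        # Standard LMS step plus the zero attractor, which shrinks
        # small coefficients toward zero and promotes a sparse solution.
        w = w + mu * e * u - rho * np.sign(w)
    return w

# Usage: identify a sparse 16-tap channel from noisy observations.
rng = np.random.default_rng(0)
h = np.zeros(16)
h[[2, 9]] = [1.0, -0.5]                      # sparse "true" channel (assumed)
x = rng.standard_normal(5000)                # white Gaussian training input
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = za_lms(x, d, num_taps=16)
```

With `rho = 0` the update reduces exactly to standard LMS; the extra `-rho * np.sign(w)` term is what distinguishes this sparse variant.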
