Abstract
This paper surveys recent advances related to sparse least mean square (LMS) algorithms. Since the standard LMS algorithm does not exploit the sparsity of the channel being estimated, we discuss various sparse LMS algorithms that aim to outperform standard LMS in sparse channel estimation. Sparse LMS algorithms force the solution to be sparse by introducing a sparsity penalty into the standard LMS cost function. Under reasonable conditions on the training data and parameters, sparse LMS algorithms are shown to be mean square stable, and their mean square error performance and convergence rate are better than those of the standard LMS algorithm. We introduce the sparse algorithms under a Gaussian noise model. The simulation results presented in this work are useful in comparing sparse LMS algorithms against each other, and in comparing sparse LMS algorithms against the standard LMS algorithm.
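To make the penalty idea concrete, the sketch below contrasts the standard LMS update with a zero-attracting (ZA-LMS) style update, one common way of adding an l1 penalty to the LMS cost, which yields an extra sign-based shrinkage term in the weight update. The step size mu, penalty weight rho, and the simulated sparse channel are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sparse channel: 64 taps, only 4 nonzero (assumed, not from the paper).
N = 64
h_true = np.zeros(N)
h_true[rng.choice(N, size=4, replace=False)] = rng.standard_normal(4)

mu = 0.005    # LMS step size (assumed)
rho = 5e-4    # sparsity-penalty weight for ZA-LMS (assumed)

w_lms = np.zeros(N)   # standard LMS weights
w_za = np.zeros(N)    # zero-attracting LMS weights

x_buf = np.zeros(N)   # tap-delay line holding the most recent input samples

for n in range(20000):
    # Shift a new Gaussian input sample into the regressor.
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()

    # Desired signal = channel output plus Gaussian observation noise.
    d = h_true @ x_buf + 0.01 * rng.standard_normal()

    # Standard LMS: gradient step on the instantaneous squared error only.
    e_lms = d - w_lms @ x_buf
    w_lms += mu * e_lms * x_buf

    # ZA-LMS: the same gradient step plus a sign-based shrinkage term
    # (from the l1 penalty) that attracts small taps toward zero.
    e_za = d - w_za @ x_buf
    w_za += mu * e_za * x_buf - rho * np.sign(w_za)

print("LMS    mean square deviation:", np.mean((w_lms - h_true) ** 2))
print("ZA-LMS mean square deviation:", np.mean((w_za - h_true) ** 2))
```

Under this kind of setup, the shrinkage term typically drives the many near-zero taps of the estimate toward zero, which is the mechanism behind the improved steady-state error and convergence reported for sparse channels.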