Abstract

In this paper, we propose a sparsity-aware set-membership normalized least mean square (SM-NLMS) algorithm for sparse channel estimation and echo cancellation. The proposed algorithm, denoted the zero-attracting SM-NLMS (ZASM-NLMS) algorithm, incorporates an l1-norm penalty into the cost function of the conventional SM-NLMS algorithm to exploit the sparsity of the system under identification. Furthermore, an improved variant, denoted the reweighted ZASM-NLMS (RZASM-NLMS) algorithm, is derived by replacing the l1-norm penalty with a log-sum penalty. Both zero-attracting algorithms are equivalent to adding a shrinkage term to the SM-NLMS update equation, which yields faster convergence and lower estimation error when most of the unknown channel coefficients are zero or close to zero. The proposed algorithms are described and analyzed in detail, and their performance is investigated through computer simulations. Simulation results for sparse channel estimation and echo cancellation show that the proposed sparse SM-NLMS algorithms outperform the NLMS, SM-NLMS, and zero-attracting NLMS (ZA-NLMS) algorithms.
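The abstract describes the shrinkage-based updates only qualitatively. As a rough, non-authoritative sketch, the Python snippet below shows one plausible form of such a data-selective update: the standard SM-NLMS step-size rule combined with a ZA-LMS-style sign attractor (for the l1 penalty) or an RZA-LMS-style reweighted attractor (for the log-sum penalty). The function name and the parameters gamma, rho, delta, and epsilon are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def zasm_nlms_update(w, x, d, gamma, rho, delta=1e-6, epsilon=None):
    """One hypothetical data-selective update with zero attraction.

    w       : current estimate of the channel coefficient vector
    x       : input regressor vector (same length as w)
    d       : desired (reference) output sample
    gamma   : error bound of the set-membership criterion
    rho     : zero-attraction strength (penalty weight)
    delta   : small constant preventing division by zero
    epsilon : if given, use the reweighted (log-sum) attractor
    """
    e = d - np.dot(w, x)           # a priori estimation error
    if abs(e) <= gamma:            # error already within the bound:
        return w                   # the set-membership test skips the update
    mu = 1.0 - gamma / abs(e)      # data-dependent SM-NLMS step size
    w = w + mu * e * x / (np.dot(x, x) + delta)
    if epsilon is None:
        # ZASM-NLMS-style shrinkage from the l1-norm penalty
        w = w - rho * np.sign(w)
    else:
        # RZASM-NLMS-style reweighted shrinkage from the log-sum penalty
        w = w - rho * np.sign(w) / (1.0 + epsilon * np.abs(w))
    return w
```

In this sketch the coefficient vector is updated only when the magnitude of the a priori error exceeds the bound gamma, which is what gives set-membership filtering its low average update rate; the attractor term then pulls small coefficients toward zero, matching the sparsity-exploiting behavior the abstract describes.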
