Abstract
The variable step-size least-mean-square (VSSLMS) algorithm is an enhanced version of the least-mean-square (LMS) algorithm that aims to improve both the convergence rate and the mean-square error. Like other popular adaptive methods such as recursive least squares and the Kalman filter, however, the VSSLMS algorithm cannot exploit system sparsity. The zero-attracting variable step-size LMS (ZA-VSSLMS) algorithm was proposed to improve the performance of the VSSLMS algorithm in identifying sparse systems: it adds an $\ell_1$-norm penalty to the original VSSLMS cost function to exploit the sparsity of the system. In this paper, we present the convergence and stability analysis of the ZA-VSSLMS algorithm. The performance of the ZA-VSSLMS algorithm is compared with those of the standard LMS, VSSLMS, and ZA-LMS algorithms in a sparse system identification setting.
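To make the update concrete, the following is a minimal Python sketch of a ZA-VSSLMS iteration, assuming a Kwong-Johnston-style step-size recursion $\mu(n+1)=\alpha\mu(n)+\gamma e^2(n)$ clipped to $[\mu_{\min},\mu_{\max}]$ and illustrative parameter values; the function and parameter names (`za_vsslms`, `rho`, `alpha`, `gamma`) are hypothetical and not taken from the paper.

```python
# Minimal sketch of a zero-attracting variable step-size LMS update.
# Assumptions (illustrative, not from the paper): Kwong-Johnston-style
# step-size recursion and the parameter values below.
import numpy as np

def za_vsslms(x, d, num_taps, rho=5e-4, alpha=0.97, gamma=4.8e-4,
              mu_min=1e-4, mu_max=0.05):
    """Identify a (sparse) FIR system from input x and desired output d."""
    w = np.zeros(num_taps)            # adaptive filter weights
    mu = mu_max                       # initial step size
    e_hist = np.zeros(len(x))
    for n in range(num_taps, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # most recent input vector
        e = d[n] - w @ u                      # a priori estimation error
        # variable step-size recursion, clipped to [mu_min, mu_max]
        mu = np.clip(alpha * mu + gamma * e**2, mu_min, mu_max)
        # LMS gradient step plus the zero-attracting (l1-penalty) term
        w = w + mu * e * u - rho * np.sign(w)
        e_hist[n] = e
    return w, e_hist

# Usage: identify a sparse 16-tap system driven by white noise.
rng = np.random.default_rng(0)
h = np.zeros(16); h[[2, 9]] = [1.0, -0.5]     # sparse "unknown" system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat, e = za_vsslms(x, d, num_taps=16)
```

The zero-attracting term `rho * np.sign(w)` is the sub-gradient of the $\ell_1$ penalty: it shrinks small coefficients toward zero while leaving the large, active taps essentially governed by the ordinary VSSLMS update.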