The well-known variable step-size least-mean-square (VSSLMS) algorithm provides a faster convergence rate while maintaining a lower mean-square error than the conventional LMS algorithm. Its performance can be improved further in a channel estimation problem when the impulse response of the channel is sparse. Recently, a zero-attracting (ZA)-VSSLMS algorithm was proposed to exploit channel sparsity by imposing an \(\ell _1\)-norm penalty on the original cost function of the VSSLMS algorithm, which promotes sparsity in the filter taps during adaptation. In this paper, we present the mean-square deviation (MSD) analysis of the ZA-VSSLMS algorithm. A steady-state MSD expression is derived, and an upper bound on the zero-attractor controller (\(\rho \)) that yields the minimum MSD is provided. Moreover, the effect of the noise distribution on the MSD performance is shown theoretically. Theoretical and simulation results are shown to be in good agreement over a wide range of parameters and for different channel, input-signal, and noise types.
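The update described above can be sketched in code. The following is a minimal illustrative simulation, not the authors' exact algorithm or parameter settings: it combines a Kwong–Johnston-style variable step size with an \(\ell _1\)-induced zero-attraction term \(\rho\,\mathrm{sgn}(w)\) in a sparse-channel identification setup. All numerical values (filter length, step-size bounds, \(\rho\), noise level) are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sparse channel to identify: 16 taps, only 2 nonzero.
N = 16
h = np.zeros(N)
h[[3, 10]] = [1.0, -0.5]

# Illustrative VSSLMS parameters (Kwong-Johnston style recursion).
alpha, gamma = 0.97, 1e-3
mu, mu_min, mu_max = 0.01, 5e-3, 5e-2
rho = 5e-5          # zero-attractor controller (assumed value)

w = np.zeros(N)     # adaptive filter weights
x_buf = np.zeros(N) # input regressor (most recent sample first)

for n in range(5000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = h @ x_buf + 0.01 * rng.standard_normal()  # noisy desired signal
    e = d - w @ x_buf                             # a priori error
    # Variable step-size recursion, clipped to [mu_min, mu_max].
    mu = float(np.clip(alpha * mu + gamma * e**2, mu_min, mu_max))
    # ZA-VSSLMS weight update: LMS term plus l1-induced zero attraction.
    w = w + mu * e * x_buf - rho * np.sign(w)

# Mean-square deviation between estimate and true channel.
msd = float(np.sum((w - h) ** 2))
```

The \(-\rho\,\mathrm{sgn}(w)\) term pulls inactive taps toward zero between LMS corrections, which is what lowers the steady-state MSD on sparse channels; making \(\rho\) too large biases the nonzero taps, which is why an upper bound on \(\rho\) matters.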