Abstract

In this paper, we propose adaptive algorithms for system identification of sparse systems. We introduce an L1-norm penalty to improve the performance of affine projection algorithms. This strategy results in two new algorithms, the zero-attracting APA (ZA-APA) and the reweighted zero-attracting APA (RZA-APA). The ZA-APA is derived by incorporating an L1-norm penalty on the coefficients into the standard APA cost function, which generates a zero attractor in the update equation. The zero attractor promotes sparsity in the filter coefficients during the update process, and therefore accelerates convergence when identifying sparse systems. We show that the ZA-APA can achieve a lower mean square error than the standard LMS and AP algorithms. To further improve the performance, the RZA-APA is developed using a reweighted zero attractor. Numerical results show that the RZA-APA outperforms the ZA-APA. Simulation results demonstrate the advantages of the proposed adaptive algorithms in both convergence rate and steady-state behavior under sparsity assumptions on the true coefficient vector. The RZA-APA is also shown to be robust when the number of non-zero taps increases.
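
To illustrate the kind of update the abstract describes, the following is a minimal NumPy sketch of zero-attracting and reweighted zero-attracting affine projection iterations. The exact update equations, parameter names (mu, rho, eps, delta), and default values are assumptions chosen for illustration, not taken from the paper itself.

```python
import numpy as np

def za_apa_update(w, X, d, mu=0.5, rho=5e-4, delta=1e-2):
    """One ZA-APA-style iteration (sketch): standard APA step plus a zero attractor.

    w     : current coefficient vector, shape (N,)
    X     : matrix of the K most recent input vectors, shape (N, K)
    d     : corresponding desired samples, shape (K,)
    mu    : step size
    rho   : zero-attractor strength (assumed parameter name)
    delta : regularization for the projection matrix inverse
    """
    e = d - X.T @ w                                   # a priori error vector
    apa_step = X @ np.linalg.solve(X.T @ X + delta * np.eye(X.shape[1]), e)
    return w + mu * apa_step - rho * np.sign(w)       # sign(w) acts as the zero attractor

def rza_apa_update(w, X, d, mu=0.5, rho=5e-4, eps=10.0, delta=1e-2):
    """One RZA-APA-style iteration (sketch): the attractor is reweighted so taps
    near zero are attracted strongly while large taps are left mostly unchanged."""
    e = d - X.T @ w
    apa_step = X @ np.linalg.solve(X.T @ X + delta * np.eye(X.shape[1]), e)
    return w + mu * apa_step - rho * np.sign(w) / (1.0 + eps * np.abs(w))
```

In this sketch, the sign(w) term pulls small coefficients toward zero, which is what accelerates convergence on sparse systems, while the reweighting factor 1/(1 + eps*|w|) in the RZA variant weakens the attraction on large taps and so reduces the bias it would otherwise introduce.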
