Abstract

The proportionate updating (PU) and zero-attracting (ZA) mechanisms have been applied independently in the development of sparsity-aware recursive least squares (RLS) algorithms. Recently, we proposed an enhanced l1-proportionate RLS (l1-PRLS) algorithm that combines the PU and ZA mechanisms. The l1-PRLS employs a fixed step size, which trades off transient (initial convergence) and steady-state performance. In this letter, the l1-PRLS is improved in two respects: first, we replace the l1-norm penalty with a general convex regularization (CR) function, yielding the CR-PRLS algorithm; second, we further introduce the variable step-size (VSS) technique into the CR-PRLS, leading to the VSS-CR-PRLS algorithm. Theoretical and numerical results are provided to corroborate the superiority of the improved algorithms.
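To make the zero-attracting idea concrete, the following is a minimal, hypothetical sketch of an l1-penalized (zero-attracting) RLS update for sparse system identification. It is not the paper's l1-PRLS (which additionally uses proportionate updating and, in the improved versions, a general convex penalty and a variable step size); it only illustrates the basic mechanism of adding an l1 subgradient term to the standard RLS coefficient update. All function and parameter names here are illustrative.

```python
import numpy as np

def za_rls_identify(x, d, n_taps, lam=0.99, rho=1e-4, delta=100.0):
    """Zero-attracting RLS sketch: the standard exponentially weighted RLS
    recursion plus an l1 subgradient term (-rho * sign(w)) that attracts
    small coefficients toward zero. Illustrative only."""
    w = np.zeros(n_taps)                      # coefficient estimate
    P = delta * np.eye(n_taps)                # inverse correlation matrix estimate
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]     # regressor [x[n], ..., x[n-M+1]]
        k = P @ u / (lam + u @ P @ u)         # RLS gain vector
        e = d[n] - w @ u                      # a priori estimation error
        w = w + k * e - rho * np.sign(w)      # RLS update + zero attraction
        P = (P - np.outer(k, u @ P)) / lam    # update inverse correlation matrix
    return w

# Usage: identify a sparse FIR system from noisy input/output data
rng = np.random.default_rng(0)
h = np.zeros(16); h[2] = 1.0; h[9] = -0.5     # sparse true impulse response
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = za_rls_identify(x, d, n_taps=16)
```

The zero-attraction strength `rho` plays a role analogous to the regularization weight in the algorithms above: larger values shrink inactive taps faster but bias the active ones, which is one motivation for replacing the fixed l1 penalty with a more general convex function.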
