Abstract

In this paper, we propose new channel sparsity-aware recursive least squares (RLS) algorithms based on a sequential update. The developed sparse sequential RLS (S-SEQ-RLS) and filtered-x RLS (S-SEQ-FxRLS) algorithms use a discard function to disregard coefficients close to zero in each channel's weight vector, reducing the computational load and improving the convergence rate. The developed l0-norm sequential RLS (l0-SEQ-RLS) and filtered-x RLS (l0-SEQ-FxRLS) algorithms minimize an error objective function augmented with a sum of weighted penalty terms, where each penalty term is the l0-norm of a channel weight vector. The channel sparsity-aware algorithms are first derived for nonlinear system modeling and then modified for nonlinear active noise control. Simulation results demonstrate that the proposed channel sparsity-aware RLS algorithms achieve performance similar to that of the sequential RLS (SEQ-RLS) algorithm, in which the channel weight vectors are updated sequentially. Furthermore, the proposed channel sparsity-aware algorithms require a lower computational load than the nonsequential sparsity-aware algorithms.
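For the l0-norm variants, a typical exponentially weighted cost with per-channel l0 penalties has the form J(n) = Σ_{i=1}^{n} λ^(n−i) e²(i) + Σ_c γ_c ‖w_c‖₀, where w_c is the weight vector of channel c, λ is the forgetting factor, and γ_c weights each channel's penalty; the abstract does not give the paper's exact objective, so this form is an assumption. Likewise, the following Python sketch only illustrates the S-SEQ-RLS idea described above: a standard per-channel RLS recursion that updates one channel per sample, followed by a discard step. The function name sparse_seq_rls, the hard-threshold form of the discard function, the threshold tau, and the round-robin update schedule are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def sparse_seq_rls(x_channels, d, taps=8, lam=0.99, delta=0.01, tau=1e-3):
    """Hypothetical sketch of a sparsity-aware sequential RLS (S-SEQ-RLS)."""
    C, N = len(x_channels), len(d)
    # Per-channel state: weight vector and inverse correlation matrix.
    w = [np.zeros(taps) for _ in range(C)]
    P = [np.eye(taps) / delta for _ in range(C)]
    e = np.zeros(N)

    for n in range(taps - 1, N):
        # Regressor per channel: the most recent `taps` input samples.
        u = [x[n - taps + 1 : n + 1][::-1] for x in x_channels]
        # A priori error uses the current weights of all channels.
        e[n] = d[n] - sum(w[c] @ u[c] for c in range(C))

        c = n % C  # sequential update: one channel per sample instant
        k = P[c] @ u[c] / (lam + u[c] @ P[c] @ u[c])    # RLS gain vector
        w[c] = w[c] + k * e[n]                           # weight update
        P[c] = (P[c] - np.outer(k, u[c] @ P[c])) / lam   # Riccati update

        # Discard function (assumed hard-threshold form): zero the
        # coefficients close to zero so they can be skipped in filtering.
        w[c][np.abs(w[c]) < tau] = 0.0

    return w, e

# Toy usage on a synthetic two-channel system with sparse true channels.
rng = np.random.default_rng(0)
x = [rng.standard_normal(4000) for _ in range(2)]
h = [np.r_[0.9, np.zeros(3), 0.3, np.zeros(3)], np.zeros(8)]
d = sum(np.convolve(xc, hc)[:4000] for xc, hc in zip(x, h))
w, e = sparse_seq_rls(x, d)   # w[0] should recover the sparse channel
```

Updating only one channel's weight vector per sample keeps the per-iteration cost at roughly one channel's O(taps²) Riccati update rather than a full multichannel update, which is the computational saving the sequential scheme targets; the thresholding step additionally keeps the weight vectors sparse, matching the abstract's motivation for the discard function.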
