Abstract

Sparsity phenomena in learning processes have been extensively studied, since their detection allows one to derive suitable regularized optimization algorithms capable of improving the overall learning performance. In this paper, we investigate the sparsity behavior that may occur in nonlinear adaptive filtering problems and how it can be leveraged to develop enhanced algorithms. In particular, we focus on a class of linear-in-the-parameters nonlinear adaptive filters whose nonlinear transformation is based on a functional link expansion. The analysis of sparsity in the functional links leads us to derive a family of adaptive combined filtering architectures capable of exploiting any degree of sparseness in the nonlinear filtering. We propose two different filtering schemes based on a new block-based combination approach that is well suited for sparse adaptive algorithms. Moreover, we also propose a hierarchical architecture that generalizes the different combined schemes and requires no a priori information about the nature of the nonlinearity to be modeled. Experimental results prove the effectiveness of the proposed combined architectures in exploiting any degree of sparseness and improving the modeling performance in nonlinear system identification problems.
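As a rough illustration of the kind of filter the paper builds on (not the paper's specific combined or hierarchical schemes), the following minimal Python sketch implements a linear-in-the-parameters functional link adaptive filter with a zero-attracting LMS update that promotes sparsity in the expanded weights. The trigonometric expansion, buffer length, step size, and regularization weight are all illustrative assumptions.

```python
# Minimal sketch, assuming a trigonometric functional link expansion and a
# zero-attracting LMS update; parameter values are illustrative, not from the paper.
import numpy as np

def functional_link_expansion(x_buf, order=2):
    """Trigonometric functional link expansion of the input buffer."""
    terms = [x_buf]
    for p in range(1, order + 1):
        terms.append(np.sin(np.pi * p * x_buf))
        terms.append(np.cos(np.pi * p * x_buf))
    return np.concatenate(terms)

def za_lms_flaf(x, d, buf_len=5, order=2, mu=0.01, rho=1e-4):
    """Zero-attracting LMS on the functional-link-expanded input.

    x: input signal, d: desired (reference) signal.
    Returns the estimation error sequence and the final weights.
    """
    g_len = buf_len * (2 * order + 1)   # length of the expanded input vector
    w = np.zeros(g_len)                 # functional link weights
    e = np.zeros(len(x))
    x_buf = np.zeros(buf_len)           # input delay line
    for n in range(len(x)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x[n]
        g = functional_link_expansion(x_buf, order)  # expanded input
        y = w @ g                                    # filter output
        e[n] = d[n] - y                              # estimation error
        # LMS gradient step plus an l1 (zero-attracting) term that
        # encourages sparsity in the functional link weights.
        w += mu * e[n] * g - rho * np.sign(w)
    return e, w

# Toy usage: identify a mildly nonlinear system from noisy observations.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 5000)
d = 0.5 * x + 0.1 * np.sin(np.pi * x) + 0.01 * rng.standard_normal(5000)
err, weights = za_lms_flaf(x, d)
print("final MSE:", np.mean(err[-500:] ** 2))
```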
