Abstract

Information-theoretic learning approaches have been combined with reproducing kernel Hilbert space (RKHS) based techniques for nonlinear and non-Gaussian signal processing applications. In particular, the generalized kernel maximum correntropy (GKMC) algorithm, which adopts the generalized Gaussian probability density function (GPDF) as the cost function for training the filter weights, has been proposed in the literature. Recently, a more flexible and computationally efficient algorithm called the maximum Versoria criterion (MVC), which adopts the generalized Versoria function as the adaptation cost, has been proposed in the literature; it delivers better performance than the maximum correntropy criterion. In this paper, we propose a novel generalized kernel maximum Versoria criterion (GKMVC) algorithm that combines the advantages of RKHS-based approaches and the MVC algorithm. Further, a novelty-criterion-based dictionary sparsification technique, as suggested for the kernel least mean square (KLMS) algorithm, is proposed for the GKMVC algorithm to reduce its computational complexity. Furthermore, an analytical upper bound on the step-size is derived to ensure convergence of the proposed algorithm. Simulations performed over various non-Gaussian noise distributions indicate that the proposed GKMVC algorithm exhibits superior performance, in terms of a lower steady-state error floor, as compared to the existing algorithms, namely the KLMS and GKMC algorithms.
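To make the ingredients named above concrete, the following is a minimal sketch (not the authors' implementation) of a kernel adaptive filter trained by stochastic-gradient ascent on a generalized Versoria cost V(e) = 1 / (1 + τ|e|^p), combined with Platt-style novelty-criterion dictionary sparsification as used for KLMS. All hyperparameter names and values (step size, τ, p, kernel width, novelty thresholds) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    """Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2.0 * sigma ** 2))

class GKMVCSketch:
    """Illustrative kernel filter maximizing the generalized Versoria
    cost V(e) = 1 / (1 + tau * |e|^p), with a novelty criterion that
    adds a new kernel center only when the input is far from the
    dictionary AND the prediction error is large."""

    def __init__(self, step=0.2, tau=1.0, p=2.0, sigma=0.5,
                 dist_thresh=0.2, err_thresh=0.01):
        self.step, self.tau, self.p, self.sigma = step, tau, p, sigma
        self.dist_thresh, self.err_thresh = dist_thresh, err_thresh
        self.centers, self.coeffs = [], []

    def predict(self, x):
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, d):
        x = np.asarray(x, dtype=float)
        e = d - self.predict(x)
        # Gradient of the Versoria cost w.r.t. the error:
        # dV/de ∝ p*tau*|e|^(p-1)*sign(e) / (1 + tau*|e|^p)^2,
        # which saturates for large |e|, giving robustness to outliers.
        g = (self.p * self.tau * np.abs(e) ** (self.p - 1) * np.sign(e)
             / (1.0 + self.tau * np.abs(e) ** self.p) ** 2)
        if not self.centers:
            self.centers.append(x)
            self.coeffs.append(self.step * g)
            return e
        # Novelty criterion: grow the dictionary only for novel,
        # poorly predicted inputs; otherwise adjust the nearest
        # coefficient so the dictionary size stays bounded.
        idx = min(range(len(self.centers)),
                  key=lambda i: np.linalg.norm(self.centers[i] - x))
        dmin = np.linalg.norm(self.centers[idx] - x)
        if dmin > self.dist_thresh and abs(e) > self.err_thresh:
            self.centers.append(x)
            self.coeffs.append(self.step * g)
        else:
            self.coeffs[idx] += self.step * g
        return e
```

Because the Versoria gradient factor decays toward zero for large errors, a single impulsive noise sample perturbs the coefficients only slightly, which is the intuition behind the lower steady-state floor under non-Gaussian noise; the novelty criterion keeps the dictionary, and hence the per-sample cost, from growing linearly with the number of training samples.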
