Abstract

Compared with traditional learning criteria such as the minimum mean square error (MMSE), the minimum error entropy (MEE) criterion has received increasing attention in nonlinear and non-Gaussian signal processing and machine learning. Since the MEE criterion is shift-invariant, a bias term must be added to achieve zero-mean error over the training data. To address this, a modification of MEE called minimization of error entropy with fiducial points (MEEF) was proposed, which controls the bias of MEE in a more elegant and efficient way. In the present paper, we propose a fixed-point minimization of error entropy with fiducial points (MEEF-FP) as an alternative to the gradient-based MEEF for training a linear-in-parameters (LIP) model, owing to its fast convergence, robustness, and step-size-free operation. We also provide a sufficient condition that guarantees the convergence of the MEEF-FP algorithm. Moreover, we develop a recursive MEEF-FP (RMEEF-FP) for online adaptive learning with low complexity. Finally, illustrative examples demonstrate the excellent performance of the new methods.
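
For concreteness, the following Python sketch illustrates one plausible form of a fixed-point MEEF iteration for a LIP model. It assumes the commonly used MEEF cost, a convex combination (weight lam) of the quadratic information potential of the errors and the correntropy between the error and the origin (the fiducial point), with a Gaussian kernel of width sigma; setting the gradient of this cost to zero and solving for the weights yields the fixed-point update below. The function name meef_fp and the parameter choices are illustrative, not the paper's; the exact formulation and the sufficient condition for convergence are given in the full text.

import numpy as np

def meef_fp(X, d, sigma=1.0, lam=0.5, n_iter=50, tol=1e-6):
    # Sketch of a fixed-point MEEF solver for a LIP model d ~ X @ w.
    # X: (N, M) regressor matrix; d: (N,) desired output.
    # sigma (kernel width) and lam (fiducial-point weight) are
    # illustrative hyperparameters, not values from the paper.
    N, M = X.shape
    w = np.linalg.lstsq(X, d, rcond=None)[0]  # least-squares initialization
    for _ in range(n_iter):
        e = d - X @ w
        # Pairwise Gaussian weights for the error-entropy (MEE) term
        E = e[:, None] - e[None, :]
        G_ee = np.exp(-E**2 / (2 * sigma**2))
        # Gaussian weights for the fiducial (correntropy-at-zero) term
        G_e = np.exp(-e**2 / (2 * sigma**2))
        # Pairwise regressor and target differences
        dX = X[:, None, :] - X[None, :, :]   # (N, N, M)
        dd = d[:, None] - d[None, :]         # (N, N)
        # Weighted normal equations of the fixed-point update
        A = ((1 - lam) / N**2) * np.einsum('ij,ijk,ijl->kl', G_ee, dX, dX) \
            + (lam / N) * (X.T * G_e) @ X
        b = ((1 - lam) / N**2) * np.einsum('ij,ij,ijk->k', G_ee, dd, dX) \
            + (lam / N) * X.T @ (G_e * d)
        w_new = np.linalg.solve(A, b)
        if np.linalg.norm(w_new - w) < tol * np.linalg.norm(w):
            return w_new
        w = w_new
    return w

Because the Gaussian weights are re-evaluated at the current error at every step, each iteration solves a reweighted least-squares problem; unlike a gradient-based MEEF update, no step size is required, which is the step-size-free property the abstract refers to.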
