Abstract

In recent years, information theoretic learning (ITL) criteria have attracted considerable research interest and are widely exploited in applications because of their robustness in the presence of heavy-tailed noise distributions. Minimum error entropy with fiducial points (MEEF), one of the ITL criteria, has not yet been thoroughly investigated in the literature. In this study, we propose a new fixed-point MEEF (FP-MEEF) algorithm and analyze its convergence based on Banach's contraction mapping theorem. We also discuss in detail the convergence rate of the proposed method, which converges to the optimal solution quadratically with an appropriate selection of the kernel size. Numerical results confirm our theoretical analysis and show that FP-MEEF outperforms FP-MSE in some non-Gaussian environments. In addition, the convergence rates of FP-MEEF and gradient-descent-based MEEF are compared in several numerical examples.
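The convergence analysis above rests on Banach's contraction mapping theorem: a fixed-point iteration converges to a unique fixed point whenever the mapping is a contraction. The following minimal sketch illustrates this mechanism on a textbook contraction (it is not the paper's FP-MEEF update, whose mapping depends on the MEEF cost and kernel size; the function `fixed_point_iterate` and its parameters are hypothetical names chosen for illustration):

```python
import math

def fixed_point_iterate(g, x0, tol=1e-12, max_iter=200):
    """Banach fixed-point iteration: repeatedly apply g until successive
    iterates differ by less than tol. Convergence to the unique fixed
    point is guaranteed when g is a contraction on the domain."""
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# cos is a contraction on [0, 1] (|cos'(x)| = |sin(x)| < 1 there),
# so the iteration x <- cos(x) converges to the unique solution of
# x = cos(x), regardless of the starting point in that interval.
root = fixed_point_iterate(math.cos, 0.5)
print(root)
```

For a plain contraction like this the convergence is linear; the paper's claim is that the FP-MEEF mapping can achieve quadratic convergence under an appropriate choice of the kernel size.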
