Abstract
In recent years, research on information theoretic learning (ITL) criteria has become very popular, and ITL concepts are widely exploited in several applications because of their robustness in the presence of heavy-tailed noise distributions. Minimum error entropy with fiducial points (MEEF), as one of the ITL criteria, has not yet been well investigated in the literature. In this study, we propose a new fixed-point MEEF (FP-MEEF) algorithm and analyze its convergence based on Banach's contraction mapping theorem. We also discuss in detail the convergence rate of the proposed method, which converges quadratically to the optimal solution with an appropriate selection of the kernel size. Numerical results confirm our theoretical analysis and show that FP-MEEF outperforms FP-MSE in some non-Gaussian environments. In addition, the convergence rates of FP-MEEF and of gradient-descent-based MEEF are evaluated in several numerical examples.
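To make the fixed-point idea concrete, the sketch below shows a generic fixed-point iteration for a Gaussian-kernel-weighted linear regression cost, in the spirit of correntropy-type fixed-point updates. This is only an illustration under assumed choices: the exact FP-MEEF update, its weighting, and its contraction conditions are those derived in the paper, and the function names, kernel size, data, and stopping rule here are hypothetical.

```python
import numpy as np

# Illustrative sketch only: a generic fixed-point update w_{k+1} = f(w_k)
# for a Gaussian-kernel-weighted regression cost (correntropy-style).
# It is NOT the paper's exact FP-MEEF update; all parameters below are
# hypothetical choices for demonstration.

def gaussian_kernel(e, sigma):
    """Gaussian kernel evaluated at the error values e."""
    return np.exp(-e**2 / (2.0 * sigma**2))

def fixed_point_regression(X, d, sigma=1.0, n_iter=50, tol=1e-8):
    """Each iteration solves a weighted normal equation whose weights
    depend on the errors produced by the current weight vector."""
    n, m = X.shape
    w = np.zeros(m)                          # initial guess
    for _ in range(n_iter):
        e = d - X @ w                        # current errors
        phi = gaussian_kernel(e, sigma)      # error-dependent weights
        # weighted normal equations: (X^T diag(phi) X) w = X^T diag(phi) d
        A = X.T @ (phi[:, None] * X)
        b = X.T @ (phi * d)
        w_new = np.linalg.solve(A, b)
        if np.linalg.norm(w_new - w) < tol:  # fixed point reached
            return w_new
        w = w_new
    return w

# Hypothetical usage on synthetic data with heavy-tailed (impulsive) noise,
# the kind of non-Gaussian setting where ITL criteria are typically robust.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
w_true = np.array([1.0, -2.0, 0.5])
d = X @ w_true + 0.1 * rng.standard_t(df=1.5, size=500)
print(fixed_point_regression(X, d, sigma=1.0))
```

As in the fixed-point view taken in the paper, the kernel size sigma governs how strongly large errors are down-weighted, which is also the quantity the convergence-rate analysis ties to the contraction (and, with appropriate selection, quadratic) behavior of the iteration.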