Abstract

The past decade has seen the rapid development of information-theoretic learning and its applications in signal processing and machine learning. In particular, the minimum error entropy (MEE) and maximum correntropy criterion (MCC) have been widely studied in the literature. Although MEE and MCC are applied across many domains and can outperform conventional statistical criteria (such as mean square error), they have not been compared with each other from a theoretical point of view. In some cases, MEE and MCC perform similarly; under other conditions (e.g., in non-Gaussian environments), however, they behave differently. This letter derives a new information-theoretic relation between MEE and MCC, leading to a better understanding of their theoretical differences, and illustrates the findings in a common example.
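For readers unfamiliar with the two criteria, the following is a minimal numerical sketch (not taken from the letter) of the standard sample estimators used in the information-theoretic-learning literature: correntropy as the mean of a Gaussian kernel over the errors (MCC maximizes it), and Rényi's quadratic entropy via the Parzen information-potential estimator over pairwise error differences (MEE minimizes it). The kernel width sigma and the toy error distributions are arbitrary choices for illustration.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    """Gaussian kernel shared by both estimators."""
    return np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def correntropy(errors, sigma=1.0):
    """Empirical correntropy of the error samples; MCC maximizes this."""
    return np.mean(gaussian_kernel(errors, sigma))

def quadratic_renyi_entropy(errors, sigma=1.0):
    """Renyi's quadratic entropy via the Parzen information-potential
    estimator (kernel width sigma*sqrt(2)); MEE minimizes this,
    equivalently maximizing the information potential."""
    diffs = errors[:, None] - errors[None, :]        # all pairwise differences
    ip = np.mean(gaussian_kernel(diffs, np.sqrt(2) * sigma))
    return -np.log(ip)

# Toy comparison on Gaussian vs. heavy-tailed (non-Gaussian) error samples.
rng = np.random.default_rng(0)
samples = {
    "Gaussian": rng.normal(0.0, 1.0, 500),
    "heavy-tailed": rng.standard_t(df=2, size=500),  # impulsive-noise stand-in
}
for name, e in samples.items():
    print(f"{name}: correntropy = {correntropy(e):.4f}, "
          f"H2 = {quadratic_renyi_entropy(e):.4f}")
```

One structural difference is already visible in these estimators: correntropy depends on the error values themselves, while the entropy estimator depends only on pairwise error differences, which is one reason the two criteria can agree in some settings yet diverge in non-Gaussian ones.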
