Abstract

In this work, we analyze, in the context of channel equalization, two criteria that can be considered central to the field of information theoretic learning (ITL): the minimum error entropy criterion (MEEC) and the maximum correntropy criterion (MCC). An original derivation of the exact cost functions of these criteria in the scenario of interest is provided and used to analyze their robustness and efficiency from a number of relevant standpoints. Another important feature of the paper is a study of the estimated versions of these cost functions, which raises several questions regarding the parameters of the canonical Parzen window estimator. The study is carried out for distinct channel and noise models, both in the combined response and parameter spaces, and employs key metrics, such as the bit error probability, as benchmarks. The conclusions indicate the conditions under which the ITL criteria are particularly reliable and identify a number of factors that can lead to suboptimal performance.
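
For concreteness, the sample estimates of these two costs that are standard in the ITL literature (the notation below is assumed for illustration, not taken verbatim from the paper) can be written with a Gaussian Parzen kernel $G_\sigma(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\,e^{-x^2/(2\sigma^2)}$ as

$$\hat{V}_{\mathrm{MCC}} = \frac{1}{N}\sum_{n=1}^{N} G_\sigma\big(d(n) - y(n)\big), \qquad \hat{H}_{2}(e) = -\log\!\left(\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N} G_{\sigma\sqrt{2}}\big(e(i) - e(j)\big)\right),$$

where $d(n)$ is the desired symbol, $y(n)$ the equalizer output, and $e(n) = d(n) - y(n)$ the error. MCC maximizes $\hat{V}_{\mathrm{MCC}}$, MEEC minimizes $\hat{H}_{2}(e)$, and the kernel width $\sigma$ is the Parzen window parameter whose role the study examines.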
