Abstract

The minimum error entropy (MEE) criterion has attracted increasing attention owing to its promise for applications in signal processing and machine learning. In the context of Bayesian estimation, the MEE criterion is concerned with estimating one random variable from another so that the entropy of the estimation error is minimized. Several theoretical results on this topic have been reported. In this work, we present further results on MEE estimation. The contributions are twofold: (1) we extend a recent result on the minimum entropy of a mixture of unimodal and symmetric distributions to a more general case, and prove that if the conditional distributions are generalized uniformly dominated (GUD), the dominant alignment will be the MEE estimator; (2) we show by examples that the MEE estimator (not limited to singular cases) may be non-unique even if the error distribution is restricted to zero-mean (unbiased).

Highlights

  • A central concept in information theory is entropy, which is a mathematical measure of the uncertainty or the amount of missing information [1]

  • In [20], it has been shown that, for the singular case, the unbiased minimum error entropy (MEE) estimation may yield non-unique solutions

  • Two issues involved in the minimum error entropy (MEE) estimation have been studied in this work

Summary

Introduction

A central concept in information theory is entropy, a mathematical measure of the uncertainty or the amount of missing information [1]. The maximum entropy principle is a powerful and widely accepted method for statistical inference or probabilistic reasoning with incomplete knowledge of the probability distribution [2]. Another important entropy principle is the minimum entropy principle, which seeks to decrease the uncertainty associated with a system. Chen et al. have investigated the robustness, non-uniqueness (for singular cases), sufficient conditions, and necessary conditions involved in MEE estimation [20]. We extend the results of Chen and Geman to a more general case, and show that when the conditional PDFs are generalized uniformly dominated (GUD), the MEE estimator equals the dominant alignment.
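The alignment idea can be illustrated with a minimal, hypothetical discrete sketch (the joint distribution and the names `cond` and `error_entropy` are illustrative, not taken from the paper): when the conditional distributions of Y given X are shifted copies of one shape, an estimator that aligns those shifts collapses the error distribution onto a single shape, minimizing the error's Shannon entropy, and any constant translation of that estimator does equally well.

```python
from collections import Counter
import math

# Hypothetical example (not from the paper): X is fair on {0, 1}, and the
# conditionals Y|X=x are the same two-point shape, shifted with x:
#   Y|X=0 uniform on {0, 1},  Y|X=1 uniform on {5, 6}.
px = {0: 0.5, 1: 0.5}
cond = {0: {0: 0.5, 1: 0.5}, 1: {5: 0.5, 6: 0.5}}

def error_entropy(g):
    """Shannon entropy (bits) of the error E = Y - g(X) for estimator g."""
    pe = Counter()
    for x, py in cond.items():
        for y, p in py.items():
            pe[y - g[x]] += px[x] * p
    return -sum(p * math.log2(p) for p in pe.values() if p > 0)

aligned    = {0: 0.0, 1: 5.0}   # aligns the two conditional shapes
misaligned = {0: 0.0, 1: 4.0}   # off by one unit for x = 1
print(error_entropy(aligned))     # 1.0 bit: the mixture collapses to one shape
print(error_entropy(misaligned))  # 1.5 bits: the error shapes only partly overlap
```

Note that shifting the aligned estimator by any constant (e.g. `{0: 0.5, 1: 5.5}`, which makes the error zero-mean) leaves the error entropy at 1.0 bit, so the minimizer here is a whole family of estimators, echoing the non-uniqueness discussed above.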

MEE Estimator for Generalized Uniformly Dominated Conditional Distributions
Non-Uniqueness of Unbiased MEE Estimation
Conclusion