Abstract

Mutual information (MI) is a basic concept in information theory, so accurate MI estimates are fundamentally important in most information-theoretic applications. This paper provides a new way of understanding and estimating MI through the copula function. First, the entropy of the copula, named the copula entropy, is defined as a measure of the dependence uncertainty represented by the copula function, and MI is then shown to be equal to the negative copula entropy. With this equivalence, MI can be estimated by first estimating the empirical copula and then estimating the entropy of that empirical copula. MI estimation thus reduces to entropy estimation, which lowers the complexity and computational cost. Tests show the method to be more effective than the traditional one.
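
To make the two-step procedure concrete: for variables whose dependence is captured by a copula density c(u), the equivalence above reads I(X; Y) = -H_c, where H_c = -∫ c(u) ln c(u) du is the copula entropy. The sketch below illustrates one way to realize the estimator, assuming rank-based pseudo-observations for the empirical copula and a k-nearest-neighbour (Kozachenko-Leonenko) entropy estimator; the abstract does not prescribe a particular entropy estimator, and the function names, the choice of k, and the test data here are illustrative assumptions.

```python
# Sketch of the two-step MI estimator: empirical copula via ranks, then a
# k-NN (Kozachenko-Leonenko) entropy estimate of the pseudo-observations.
# The estimator choice and all parameters are assumptions, not the paper's.
import numpy as np
from scipy.stats import rankdata
from scipy.special import digamma, gammaln
from sklearn.neighbors import NearestNeighbors

def empirical_copula(x):
    """Map each column of x (n samples, d dims) to normalized ranks,
    giving pseudo-observations of the copula on the unit cube."""
    n = x.shape[0]
    # Divide by n + 1 to keep the points strictly inside (0, 1)^d
    return np.column_stack([rankdata(col) / (n + 1) for col in x.T])

def knn_entropy(u, k=3):
    """Kozachenko-Leonenko differential entropy estimate (in nats)."""
    n, d = u.shape
    dist, _ = NearestNeighbors(n_neighbors=k + 1).fit(u).kneighbors(u)
    eps = dist[:, k]  # distance to the k-th neighbour (column 0 is self)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log unit-ball volume
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

def copula_mi(x, k=3):
    """MI estimate: the negative entropy of the empirical copula."""
    return -knn_entropy(empirical_copula(x), k=k)

# Usage: two correlated Gaussians, true MI = -0.5 * ln(1 - 0.8**2) ≈ 0.51
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=2000)
print(copula_mi(z))
```

The rank transform is what makes the reduction work: it strips out the marginal distributions, so the entropy that remains in the pseudo-observations reflects only the dependence structure, and its negative is the MI estimate.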
