Abstract

Mutual information (MI) can be viewed as a measure of multivariate association in a random vector. However, the estimation of MI is difficult, since estimating the joint probability density function (PDF) of non-Gaussian distributed data is a hard problem. The copula function is an appropriate tool for estimating MI, since the joint probability density function of random variables can be expressed as the product of the associated copula density function and the marginal PDFs. The proposed copulas-based mutual information is much more accurate than conventional methods such as the joint histogram and Parzen window-based MI. In this paper, using the copulas-based method, we compute MI for some families of bivariate distribution functions and study the relationship between Kendall's tau correlation and the MI of bivariate distributions. Finally, using a real dataset, we illustrate the efficiency of this approach.
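
The decomposition behind this claim is the standard copula identity (Sklar's theorem): the joint PDF factors as

f(x, y) = c(F_X(x), F_Y(y)) f_X(x) f_Y(y),

where c is the copula density and F_X, F_Y are the marginal CDFs. Substituting this into the definition of MI and changing variables to u = F_X(x), v = F_Y(y), the marginal terms cancel inside the logarithm, leaving

MI(X, Y) = ∫∫ c(u, v) log c(u, v) du dv

over the unit square. This is why the mutual information depends only on the copula density, not on the marginals.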

Highlights

  • One way of determining the measure of dependence between two random variables is using information theory. Measures such as entropy, mutual information, and quadratic mutual information play an important role in measuring the dependence of bivariate distributions, and several papers have been written on this subject

  • Note that the copulas-based mutual information relies only on the copula density function, which is determined by the copula parameter; hence only the copula parameter is required to estimate mutual information (MI)

Summary

Introduction

Revista Colombiana de Estadística 43 (2020) 3–20

One way of determining the measure of dependence between two random variables is using information theory. Mutual information (also known as Kullback-Leibler divergence) is a general measure of the dependence between two random variables. In digital image processing, three approaches are usually used to estimate MI: assuming a specific multivariate distribution such as the multivariate Gaussian, the joint histogram (Maes, Collignon, Vandermeulen, Marchal & Suetens 1997), and the Parzen window (Kwak & Choi 2002). The histogram bin width and the kernel function parameters are difficult to choose, and neither the joint histogram nor the Parzen window can estimate the continuous form of the joint probability density function. We estimate MI for some families of bivariate distribution functions using copulas-based mutual information, evaluating the required integrals numerically. We use the R software (R Development Core Team 2012) and, for numerical integration, the R package "cubature"
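The copulas-based approach described above can be sketched in code. The paper itself works in R with the "cubature" package; the following is a hypothetical Python sketch (using scipy's `dblquad` for the numerical double integral) for the Clayton copula, one of the families listed in the outline. The copula density formula and the Kendall's tau relation tau = theta / (theta + 2) are standard results for the Clayton family; the function names here are illustrative, not from the paper.

```python
# Illustrative sketch (not the paper's code): copulas-based mutual
# information for the Clayton copula via numerical integration.
import numpy as np
from scipy import integrate


def clayton_density(u, v, theta):
    """Clayton copula density c(u, v) for theta > 0."""
    return ((1 + theta) * (u * v) ** (-theta - 1)
            * (u ** (-theta) + v ** (-theta) - 1) ** (-2 - 1 / theta))


def copula_mi(theta):
    """MI(X, Y) = double integral of c(u,v) * log c(u,v) over (0,1)^2,
    since the marginals cancel out of the MI integrand."""
    def integrand(v, u):
        c = clayton_density(u, v, theta)
        return c * np.log(c)
    mi, _err = integrate.dblquad(integrand, 0, 1, 0, 1)
    return mi


def kendall_tau(theta):
    """Kendall's tau for the Clayton copula: tau = theta / (theta + 2)."""
    return theta / (theta + 2)


theta = 2.0
print(f"MI  ~ {copula_mi(theta):.4f}")
print(f"tau = {kendall_tau(theta):.4f}")  # 2 / (2 + 2) = 0.5
```

Because only the copula parameter theta enters the integrand, this directly illustrates the highlight above: estimating theta (e.g. by inverting the Kendall's tau relation from sample data) is all that is needed to estimate MI.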

Copulas-Based Mutual Information
Mutual Information in Some Family of Bivariate Distribution
Cuadras-Augé Copula
Clayton Copula
Frank Copula
Gumbel Copula
Raftery Copula
Gaussian Copula
T-Copulas
Comparing Mutual Information
Data Analysis
Conclusion