Abstract

Entropy has been widely employed as a measure of variability in problems such as machine learning and signal processing. In this paper, we provide some new insights into the behavior of entropy as a measure of multivariate variability. The relationships between multivariate entropy (joint or total marginal) and traditional measures of multivariate variability, such as the total dispersion and the generalized variance, are investigated. It is shown that, for the jointly Gaussian case, the joint entropy (or joint entropy power) is equivalent to the generalized variance, while the total marginal entropy is equivalent to the geometric mean of the marginal variances and the total marginal entropy power is equivalent to the total dispersion. The smoothed multivariate entropy (joint or total marginal) and the kernel density estimation (KDE)-based entropy estimator (with finite samples) are also studied; under certain conditions, these are approximately equivalent to the total dispersion (or to a total dispersion estimator), regardless of the data distribution.
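To make these equivalences concrete, recall the standard differential-entropy identities for a d-variate Gaussian X ~ N(μ, Σ) with marginal variances σ1², …, σd² (written here in generic notation, which need not match the paper's): the joint entropy and the joint entropy power are increasing functions of the generalized variance |Σ|, the total marginal entropy is an increasing function of the geometric mean of the marginal variances, and the total marginal entropy power equals the total dispersion Tr(Σ).

```latex
% Joint entropy, joint entropy power, total marginal entropy, and total
% marginal entropy power of a d-variate Gaussian X ~ N(mu, Sigma).
\begin{align}
  h(X) &= \tfrac{1}{2}\log\bigl((2\pi e)^d\,|\Sigma|\bigr), &
  N(X) &= \tfrac{1}{2\pi e}\,e^{2h(X)/d} = |\Sigma|^{1/d},\\
  \sum_{i=1}^{d} h(X_i) &= \tfrac{d}{2}\log\Bigl(2\pi e\,\bigl(\textstyle\prod_{i=1}^{d}\sigma_i^2\bigr)^{1/d}\Bigr), &
  \sum_{i=1}^{d} N(X_i) &= \sum_{i=1}^{d}\sigma_i^2 = \operatorname{Tr}(\Sigma).
\end{align}
```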

Highlights

  • The concept of entropy can be used to quantify uncertainty, complexity, randomness, and regularity [1,2,3,4]

  • We show that for the jointly Gaussian case, the joint entropy and joint entropy power are equivalent to the generalized variance, while total marginal entropy is equivalent to the geometric mean of the marginal variances and total marginal entropy power is equivalent to the total dispersion

  • From Theorem 2 we find that, for the jointly Gaussian case, Rényi's joint entropy Hα(X) is equivalent to the generalized variance |Σ|, and the order-α total marginal entropy Tα(X) is equivalent to the geometric mean of the d marginal variances, as made explicit by the Gaussian closed form sketched after this list
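As a check on the last highlight, the order-α Rényi entropy of a d-variate Gaussian has the closed form below (a standard computation, written in generic notation; we also assume, as the name suggests, that Tα(X) denotes the sum of the marginal order-α entropies). For fixed α and d, Hα(X) differs from ½ log|Σ| only by a constant, and Tα(X) differs from a fixed multiple of the log geometric mean of the marginal variances only by a constant, which is the stated equivalence.

```latex
% Order-alpha Renyi entropies of X ~ N(mu, Sigma) in R^d (alpha > 0, alpha != 1).
\begin{align}
  H_\alpha(X) &= \frac{1}{2}\log|\Sigma| + \frac{d}{2}\log(2\pi)
               + \frac{d}{2}\,\frac{\log\alpha}{\alpha-1},\\
  T_\alpha(X) &= \sum_{i=1}^{d} H_\alpha(X_i)
             = \frac{d}{2}\log\Bigl(\bigl(\textstyle\prod_{i=1}^{d}\sigma_i^2\bigr)^{1/d}\Bigr)
               + \frac{d}{2}\log(2\pi) + \frac{d}{2}\,\frac{\log\alpha}{\alpha-1}.
\end{align}
```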



Introduction

The concept of entropy can be used to quantify uncertainty, complexity, randomness, and regularity [1,2,3,4]. The total dispersion (i.e., the trace of the covariance matrix) and the generalized variance (i.e., the determinant of the covariance matrix) are two widely used measures of multivariate variability, both of which have limitations [18,19,20]. They involve only second-order statistics and therefore cannot adequately describe non-Gaussian distributions. We show that, with a finite number of samples, the kernel density estimation (KDE)-based entropy (joint or total marginal) estimator is approximately equivalent to a total dispersion estimator if the kernel function is Gaussian with an identity covariance matrix and the smoothing factor is large enough.
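The following minimal numerical sketch illustrates this behavior (it is not code from the paper; the leave-one-out plug-in estimator, the test distributions, and the smoothing value h are assumptions chosen for illustration). With a Gaussian kernel N(0, h²I) and a large h, the KDE-based joint entropy estimate grows approximately as an increasing affine function of the sample total dispersion Tr(S), even for skewed non-Gaussian data.

```python
# Minimal sketch: KDE plug-in entropy vs. total dispersion for a large smoothing factor.
# Assumptions (not from the paper): leave-one-out resubstitution estimator,
# exponential test data, n = 500 samples in d = 3 dimensions, h = 10.
import numpy as np

def kde_entropy(X, h):
    """Leave-one-out KDE plug-in estimate of joint entropy, Gaussian kernel N(0, h^2 I)."""
    n, d = X.shape
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(sq_dists, np.inf)  # exclude each point from its own density estimate
    log_kernel = -sq_dists / (2.0 * h ** 2) - 0.5 * d * np.log(2.0 * np.pi * h ** 2)
    log_f = np.logaddexp.reduce(log_kernel, axis=1) - np.log(n - 1)  # log f_hat(x_i)
    return -np.mean(log_f)  # H_hat = -(1/n) * sum_i log f_hat(x_i)

def total_dispersion(X):
    """Trace of the sample covariance matrix."""
    return np.trace(np.cov(X, rowvar=False))

rng = np.random.default_rng(0)
h = 10.0  # smoothing factor, large relative to the data scale
for scale in (0.5, 1.0, 2.0):
    X = scale * rng.exponential(1.0, (500, 3))  # skewed, non-Gaussian test data
    print(f"scale={scale}: Tr(S)={total_dispersion(X):7.3f}  "
          f"KDE entropy (h={h})={kde_entropy(X, h):.4f}")
# For large h, -log f_hat(x_i) ~ (d/2) log(2*pi*h^2) + (mean pairwise sq. distance)/(2 h^2),
# and the mean pairwise squared distance estimates 2*Tr(Sigma), so the entropy estimate
# increases (approximately affinely) with the total dispersion estimate.
```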

Shannon’s Entropy
Entropy Powers
Smoothed Multivariate Entropy Measures
Conclusions
