Abstract

Estimating the differential entropy from observations of a random variable is of great importance for a wide range of signal processing applications, such as source coding, pattern recognition, hypothesis testing, and blind source separation. In this paper, we present a method for estimating the Shannon differential entropy that accounts for data supported on embedded manifolds. The method is based on high-rate quantization theory and extends the classical nearest-neighbor entropy estimator. The estimator is consistent in the mean-square sense, and an upper bound on its rate of convergence is given. Because of the close connection between compression and the Shannon entropy, the proposed method has an advantage over methods that estimate the Rényi entropy. Through experiments on uniformly distributed data on known manifolds and on real-world speech data, we demonstrate the accuracy and usefulness of the proposed method.
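For concreteness, the following is a minimal sketch of the classical nearest-neighbor (Kozachenko-Leonenko) estimator that the abstract says the proposed method extends; it is not the manifold-aware estimator of the paper itself. The function name kl_entropy and the use of numpy/scipy are illustrative assumptions.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gamma

def kl_entropy(x):
    # Classical Kozachenko-Leonenko nearest-neighbor estimate of the
    # Shannon differential entropy, in nats. This is the baseline the
    # paper builds on; it does not account for embedded manifolds.
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Distance from each sample to its nearest neighbor; k=2 because
    # the closest point returned for each query is the sample itself.
    dist, _ = cKDTree(x).query(x, k=2)
    rho = dist[:, 1]
    # Volume of the d-dimensional unit Euclidean ball.
    v_d = np.pi ** (d / 2) / gamma(d / 2 + 1)
    return digamma(n) - digamma(1) + np.log(v_d) + d * np.mean(np.log(rho))

As a rough sanity check, for samples drawn from a standard one-dimensional Gaussian the estimate should approach 0.5*ln(2*pi*e), about 1.419 nats, as the sample size grows.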
