Abstract

This work extends the study of the convergence properties of the Shannon differential entropy and its connections with the convergence of probability measures in the sense of total variation and of direct and reverse information divergence. The results relate the topics of distribution (density) estimation and the estimation of Shannon information measures, with special focus on the case of differential entropy. On the application side, this work presents an explicit analysis of density estimation and differential entropy estimation for distributions defined on a finite-dimensional Euclidean space (ℝ^d, B(ℝ^d)). New consistency results are derived for several histogram-based estimators: the classical product scheme, Barron's estimator, one of the approaches proposed by Györfi and Van der Meulen, and the data-driven partition scheme of Lugosi and Nobel.
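The histogram-based estimators named above all share the same plug-in idea: partition the space into cells, estimate the density as the empirical cell frequency divided by the cell volume, and evaluate the differential entropy of that piecewise-constant density. As a rough illustration only (not the paper's specific constructions), the following sketch shows the one-dimensional plug-in computation; the function name and bin count are choices made here, not taken from the paper.

```python
import numpy as np

def histogram_entropy(samples, n_bins=50):
    """Plug-in differential entropy estimate from a histogram density.

    Estimates h(p) = -integral p log p via
    h_hat = -sum_i p_hat_i * log(p_hat_i / w_i),
    where p_hat_i is the empirical probability of cell i and
    w_i is the cell width (so p_hat_i / w_i is the density estimate).
    """
    counts, edges = np.histogram(samples, bins=n_bins)
    widths = np.diff(edges)
    p_hat = counts / counts.sum()          # empirical cell probabilities
    mask = p_hat > 0                       # empty cells contribute 0 (0 log 0 := 0)
    return -np.sum(p_hat[mask] * np.log(p_hat[mask] / widths[mask]))
```

For a standard Gaussian sample, the estimate should approach the true value ½·log(2πe) ≈ 1.419 nats as the sample size grows and the cells shrink; the consistency results in the paper make conditions of this kind precise for the schemes it analyzes.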
