Abstract

We study the relative entropy between the empirical estimate of a discrete distribution and the true underlying distribution. If the minimum value of the probability mass function exceeds some $\alpha > 0$ (i.e., when the true underlying distribution is bounded sufficiently away from the boundary of the simplex), we prove an upper bound on the moment generating function of the centred relative entropy that matches, up to logarithmic factors in the alphabet size and $\alpha$, the optimal asymptotic rates, subsequently leading to a sharp concentration inequality for the centred relative entropy. As a corollary of this result, we also obtain confidence intervals and moment bounds for the centred relative entropy that are sharp up to logarithmic factors in the alphabet size and $\alpha$.
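The central quantity of the abstract can be illustrated concretely. The sketch below (all names and the specific distribution are our own choices for illustration, not taken from the paper) draws i.i.d. samples from a discrete distribution whose minimum mass is bounded away from zero, forms the empirical estimate, and computes the relative entropy $D(\hat p \,\|\, p)$:

```python
import numpy as np

def empirical_kl(samples, p):
    """Relative entropy D(p_hat || p), where p_hat is the empirical
    pmf of `samples` over the alphabet {0, ..., len(p)-1}."""
    k = len(p)
    counts = np.bincount(samples, minlength=k)
    p_hat = counts / counts.sum()
    mask = p_hat > 0  # symbols with empirical mass 0 contribute 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p[mask])))

rng = np.random.default_rng(0)
# True distribution with minimum mass alpha = 0.2 > 0,
# i.e. bounded away from the boundary of the simplex.
p = np.array([0.5, 0.3, 0.2])
samples = rng.choice(len(p), size=1000, p=p)
d = empirical_kl(samples, p)
```

Since the support of $\hat p$ is contained in that of $p$ when $\min_i p_i \ge \alpha > 0$, the quantity `d` is always finite and nonnegative; the paper's concentration results control how `d` fluctuates around its mean as the sample size grows.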
