Abstract

We study the relative entropy between the empirical estimate of a discrete distribution and the true underlying distribution. If the minimum value of the probability mass function exceeds some $\alpha > 0$ (i.e. when the true underlying distribution is bounded sufficiently away from the boundary of the simplex), we prove an upper bound on the moment generating function of the centred relative entropy that matches (up to logarithmic factors in the alphabet size and $\alpha$) the optimal asymptotic rates, subsequently leading to a sharp concentration inequality for the centred relative entropy. As a corollary of this result we also obtain confidence intervals and moment bounds for the centred relative entropy that are sharp up to logarithmic factors in the alphabet size and $\alpha$.
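As a minimal numerical sketch of the quantity studied here, the snippet below draws i.i.d. samples from a discrete distribution whose smallest mass exceeds some $\alpha > 0$ and computes the relative entropy $D(\hat{p} \,\|\, p)$ between the empirical estimate and the truth. This is an illustration only (the function name and the chosen distribution are ours, not the paper's), assuming NumPy is available:

```python
import numpy as np

def empirical_kl(samples, p):
    """Relative entropy D(p_hat || p) between the empirical distribution
    of `samples` (values in {0, ..., k-1}) and the true pmf `p`."""
    p = np.asarray(p, dtype=float)
    counts = np.bincount(samples, minlength=len(p))
    p_hat = counts / counts.sum()
    mask = p_hat > 0  # symbols with p_hat[i] = 0 contribute 0 to the sum
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p[mask])))

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])  # min_i p_i = 0.2, bounded away from the boundary
samples = rng.choice(len(p), size=10_000, p=p)
d = empirical_kl(samples, p)
# d is nonnegative and, by results of this kind, concentrates near its
# (small, O(k/n)-scale) mean as the sample size n grows
```

The concentration results of the paper control how tightly `d` clusters around its expectation when `min(p)` is bounded below by $\alpha$.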
