Abstract

We prove an exponential-decay concentration inequality to bound the tail probability of the difference between the log-likelihood of discrete random variables on a finite alphabet and the negative entropy. The concentration bound we derive holds uniformly over all parameter values. The new result improves the convergence rate in an earlier result of Zhao (2020), from $(K^2 \log K)/n = o(1)$ to $(\log K)^2/n = o(1)$, where $n$ is the sample size and $K$ is the size of the alphabet. We further prove that the rate $(\log K)^2/n = o(1)$ is optimal. The result is extended to misspecified log-likelihoods for grouped random variables. We give applications of the new result in information theory.
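
For orientation, the following is a minimal schematic of the type of bound the abstract describes, not the paper's precise theorem: writing $H(P)$ for the Shannon entropy of a distribution $P$ on a $K$-letter alphabet, one seeks a tail bound for the normalized log-likelihood around $-H(P)$ that decays exponentially and holds uniformly over $P$. The constants $c_1$, $c_2$ and the exact dependence on $\varepsilon$ and $K$ below are placeholders chosen only to be consistent with the stated rate $(\log K)^2/n = o(1)$.

```latex
% Schematic only: X_1, ..., X_n i.i.d. from a distribution P on an alphabet of size K.
% The normalized log-likelihood concentrates around the negative entropy -H(P),
% with an exponential tail uniform in P; c_1, c_2 are unspecified placeholder constants.
\[
  \Pr\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} \log P(X_i) + H(P) \right| \ge \varepsilon \right)
  \;\le\; c_1 \exp\!\left( - \frac{c_2\, n\, \varepsilon^2}{(\log K)^2} \right),
  \qquad
  H(P) = -\sum_{k=1}^{K} P(k)\,\log P(k).
\]
```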
