Abstract

The problem of estimating the Kullback–Leibler divergence $D(P\|Q)$ between two unknown distributions $P$ and $Q$ is studied, under the assumption that the alphabet size $k$ of the distributions can scale to infinity. The estimation is based on $m$ independent samples drawn from $P$ and $n$ independent samples drawn from $Q$. It is first shown that no consistent estimator guarantees asymptotically small worst-case quadratic risk over the set of all pairs of distributions. A restricted set of pairs of distributions whose density ratio is bounded by a function $f(k)$ is then considered. An augmented plug-in estimator is proposed, and its worst-case quadratic risk is shown to be within a constant factor of $\left(\frac{k}{m}+\frac{kf(k)}{n}\right)^{2}+\frac{\log^{2} f(k)}{m}+\frac{f(k)}{n}$, provided that $m$ and $n$ exceed constant factors of $k$ and $kf(k)$, respectively. Moreover, the minimax quadratic risk is characterized to within a constant factor of $\left(\frac{k}{m\log k}+\frac{kf(k)}{n\log k}\right)^{2}+\frac{\log^{2} f(k)}{m}+\frac{f(k)}{n}$, provided that $m$ and $n$ exceed constant factors of $k/\log k$ and $kf(k)/\log k$, respectively. The lower bound on the minimax quadratic risk is established by employing a generalized Le Cam method. A minimax-optimal estimator is then constructed by combining the polynomial approximation and plug-in approaches.
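To make the plug-in idea concrete, the following is a minimal sketch (not from the paper) of an augmented plug-in estimator: $P$ is estimated by raw empirical frequencies, while the estimate of $Q$ is augmented with add-one smoothing so the plug-in ratio stays finite. The paper's exact augmentation constant may differ; this is an illustrative assumption.

```python
import numpy as np

def augmented_plugin_kl(samples_p, samples_q, k):
    """Sketch of an augmented plug-in estimator of D(P||Q).

    P is estimated by empirical frequencies; Q is smoothed with an
    add-one (Laplace-style) correction so that p_i / q_i is finite.
    The paper's precise augmentation may differ; this is illustrative.
    """
    m, n = len(samples_p), len(samples_q)
    p_hat = np.bincount(samples_p, minlength=k) / m              # empirical P
    q_hat = (np.bincount(samples_q, minlength=k) + 1) / (n + k)  # augmented Q
    mask = p_hat > 0                                             # 0 * log 0 = 0
    return np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask]))

if __name__ == "__main__":
    # Hypothetical usage: compare the estimate against the true divergence.
    rng = np.random.default_rng(0)
    k = 100
    P = rng.dirichlet(np.ones(k))
    Q = rng.dirichlet(np.ones(k))
    samples_p = rng.choice(k, size=10_000, p=P)
    samples_q = rng.choice(k, size=10_000, p=Q)
    true_kl = np.sum(P * np.log(P / Q))
    print("estimate:", augmented_plugin_kl(samples_p, samples_q, k))
    print("true KL: ", true_kl)
```

The add-one smoothing on $Q$ is what makes the estimator "augmented": without it, any symbol unseen in the $Q$ sample but present in the $P$ sample would make the plug-in ratio infinite.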
