Abstract

We consider the problem of minimax estimation of the entropy of a density over Lipschitz balls. Dropping the usual assumption that the density is bounded away from zero, we obtain the minimax rates $(n\ln n)^{-s/(s+d)}+n^{-1/2}$ for $0<s\leq 2$ for densities supported on $[0,1]^{d}$, where $s$ is the smoothness parameter and $n$ is the number of independent samples. We generalize the results to densities with unbounded support: given an Orlicz function $\Psi$ of rapid growth (such as the sub-exponential and sub-Gaussian classes), the minimax rates for densities with bounded $\Psi$-Orlicz norm increase to $(n\ln n)^{-s/(s+d)}(\Psi^{-1}(n))^{d(1-d/(p(s+d)))}+n^{-1/2}$, where $p$ is the norm parameter in the Lipschitz ball. We also show that integral-form plug-in estimators based on kernel density estimates fail to achieve the minimax rates, and we characterize their worst-case performance over the Lipschitz ball. One of the key steps in analyzing the bias relies on a novel application of the Hardy–Littlewood maximal inequality, which also leads to a new inequality on the Fisher information that may be of independent interest.
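
As a concrete illustration of the estimator in the penultimate claim, the following minimal sketch computes the integral-form plug-in estimate $\hat{H}=-\int \hat{f}(x)\ln \hat{f}(x)\,dx$ from a kernel density estimate in one dimension. This is a sketch under illustrative assumptions, not the construction analyzed in the paper: the Gaussian kernel, the default Scott's-rule bandwidth of scipy.stats.gaussian_kde, and the trapezoidal integration grid are choices made here for brevity.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import gaussian_kde

def plugin_entropy(samples, grid_size=2048):
    """Integral-form plug-in entropy estimate -∫ f̂ ln f̂ (1-D sketch).

    The Gaussian kernel, Scott's-rule bandwidth, and integration grid
    are illustrative defaults, not the tuning analyzed in the paper.
    """
    kde = gaussian_kde(samples)  # Gaussian kernel, Scott's-rule bandwidth
    xs = np.linspace(samples.min(), samples.max(), grid_size)
    f_hat = kde(xs)
    # Guard against log(0); the integrand -f ln f tends to 0 as f -> 0
    integrand = -f_hat * np.log(np.clip(f_hat, 1e-300, None))
    return trapezoid(integrand, xs)

# Example: Uniform[0,1] samples, whose true differential entropy is 0
rng = np.random.default_rng(0)
print(plugin_entropy(rng.uniform(size=10_000)))
```

The abstract's point is that estimators of exactly this integral form provably cannot attain the minimax rates above, so the sketch serves as a baseline rather than an optimal procedure.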
