Abstract

It is shown that the uniform distance between the distribution function \(F_n^K(h)\) of the usual kernel density estimator (based on an i.i.d. sample from an absolutely continuous law on \(\mathbb{R}\)) with bandwidth \(h\) and the empirical distribution function \(F_n\) satisfies an exponential inequality. This inequality is used to obtain sharp almost sure rates of convergence of \(\|F_n^K(h_n)-F_n\|_\infty\) under mild conditions on the range of bandwidths \(h_n\), including the usual MISE-optimal choices. Another application is a Dvoretzky–Kiefer–Wolfowitz-type inequality for \(\|F_n^K(h)-F\|_\infty\), where \(F\) is the true distribution function. The exponential bound is also applied to show that an adaptive estimator can be constructed that efficiently estimates the true distribution function \(F\) in sup-norm loss and, at the same time, estimates the density of \(F\), if it exists (without assuming that it does), at the best possible rate of convergence over Hölder balls, again in sup-norm loss.
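Here \(F_n^K(h)(x)=n^{-1}\sum_{i=1}^{n}\mathbb{K}\big((x-X_i)/h\big)\) with \(\mathbb{K}(t)=\int_{-\infty}^{t}K(u)\,du\), i.e., the kernel density estimator integrated up to \(x\). The following minimal numerical sketch (illustrative, not part of the paper) evaluates \(F_n^K(h)\) for a Gaussian kernel, for which \(\mathbb{K}\) is the standard normal CDF, and approximates \(\|F_n^K(h)-F_n\|_\infty\) on a grid; the simulated sample, the bandwidth \(h=n^{-1/5}\) (of the usual MISE-optimal order for density estimation), and all function names are assumptions made for this example.

```python
import numpy as np
from scipy.stats import norm

def kernel_cdf(sample, x, h):
    """F_n^K(h) at points x: average of the integrated kernel.
    With a Gaussian kernel K, the integrated kernel is the standard
    normal CDF Phi, so F_n^K(h)(x) = mean_i Phi((x - X_i)/h)."""
    return norm.cdf((x[:, None] - sample[None, :]) / h).mean(axis=1)

def empirical_cdf(sample, x):
    """Empirical distribution function F_n at points x."""
    return np.searchsorted(np.sort(sample), x, side="right") / len(sample)

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=n)   # i.i.d. sample from an absolutely continuous law
h = n ** (-1 / 5)        # bandwidth of the usual MISE-optimal order

# Grid approximation of the sup-norm: the exact supremum of
# |F_n^K(h) - F_n| is attained at or near the jump points of F_n,
# so a fine grid only approximates it.
grid = np.linspace(X.min() - 1.0, X.max() + 1.0, 5000)
sup_dist = np.abs(kernel_cdf(X, grid, h) - empirical_cdf(X, grid)).max()
print(f"grid approximation of ||F_n^K(h) - F_n||_inf: {sup_dist:.4f}")
```

As the paper's rates suggest, this distance is of smaller order than the \(n^{-1/2}\) fluctuations of \(F_n\) itself for such bandwidth choices, which is what makes the plug-in estimator \(F_n^K(h_n)\) competitive with \(F_n\) for estimating \(F\).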
