Abstract

Let $(X_1,\ldots,X_n)$ be an i.i.d. sequence of random variables in $\mathbb{R}^d$, $d\geq 1$. We show that, for any function $\varphi:\mathbb{R}^d\rightarrow\mathbb{R}$, under regularity conditions, \[n^{1/2}\Biggl(n^{-1}\sum_{i=1}^n\frac{\varphi(X_i)}{\widehat{f}(X_i)}-\int \varphi(x)\,dx\Biggr)\stackrel{\mathbb{P}}{\longrightarrow}0,\] where $\widehat{f}$ is the classical kernel estimator of the density of $X_1$. This result is striking because the rate is faster than the traditional root-$n$ rate given by the central limit theorem when $\widehat{f}=f$. Although this paper highlights some applications, we mainly address theoretical issues related to the latter result. We derive upper bounds for the rate of convergence in probability; these bounds depend on the regularity of the functions $\varphi$ and $f$, the dimension $d$, and the bandwidth of the kernel estimator $\widehat{f}$. Moreover, the bounds are shown to be accurate, since they serve as renormalizing sequences in two central limit theorems, each reflecting a different degree of smoothness of $\varphi$. As an application to regression modelling with random design, we establish the asymptotic normality of estimators of linear functionals of a regression function; as a consequence of the above result, the asymptotic variance does not depend on the regression function. Finally, we discuss the choice of the bandwidth for integral approximation and illustrate the good behavior of our procedure through simulations.
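To make the estimator concrete, here is a minimal sketch of the kernel-smoothed Monte Carlo average $n^{-1}\sum_{i=1}^n \varphi(X_i)/\widehat{f}(X_i)$, assuming a Gaussian kernel for $\widehat{f}$. The function name `kernel_smoothing_integral`, the sample size, and the bandwidth value are illustrative assumptions, not prescriptions from the paper.

```python
import numpy as np

def kernel_smoothing_integral(x, phi, bandwidth):
    """Estimate the integral of phi by n^{-1} sum_i phi(x_i) / fhat(x_i),
    where fhat is the classical Gaussian-kernel density estimator built
    from the sample x itself (a sketch; kernel choice and API are
    illustrative assumptions)."""
    n, d = x.shape
    # Pairwise scaled differences (x_i - x_j) / h, shape (n, n, d).
    diffs = (x[:, None, :] - x[None, :, :]) / bandwidth
    # Gaussian kernel K((x_i - x_j) / h) for every pair, shape (n, n).
    k = np.exp(-0.5 * (diffs ** 2).sum(axis=-1)) / (2 * np.pi) ** (d / 2)
    # Kernel density estimate fhat(x_i) = (n h^d)^{-1} sum_j K(...).
    f_hat = k.mean(axis=1) / bandwidth ** d
    # Kernel-smoothed Monte Carlo average.
    return np.mean(phi(x) / f_hat)

# Toy check: X_i ~ N(0, 1) and phi(x) = exp(-x^2 / 2), whose integral
# over the real line is sqrt(2 * pi) ~ 2.5066.
rng = np.random.default_rng(0)
x = rng.standard_normal((2000, 1))
phi = lambda t: np.exp(-0.5 * (t ** 2).sum(axis=-1))
print(kernel_smoothing_integral(x, phi, bandwidth=0.3))
```

If the result above holds, this estimate should typically sit closer to the true integral than the plain Monte Carlo estimator that divides by the true density $f$, which only attains the root-$n$ rate.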
