Abstract

In the regression model with errors in variables, we observe $n$ i.i.d. copies of $(Y,Z)$ satisfying $Y=f_{\theta^0}(X)+\xi$ and $Z=X+\epsilon$, involving independent and unobserved random variables $X,\xi,\epsilon$, and a regression function $f_{\theta^0}$ known up to a finite-dimensional parameter $\theta^0$. The common densities of the $X_i$'s and of the $\xi_i$'s are unknown, whereas the distribution of $\epsilon$ is completely known. We aim at estimating the parameter $\theta^0$ from the observations $(Y_1,Z_1),\dots,(Y_n,Z_n)$. We propose an estimation procedure based on the least squares criterion $\tilde{S}_{\theta^0,g}(\theta)=\mathbb{E}_{\theta^0,g}[(Y-f_{\theta}(X))^2 w(X)]$, where $w$ is a weight function to be chosen. We propose an estimator and derive an upper bound on its risk that depends on the smoothness of the errors density $p_{\epsilon}$ and on the smoothness properties of $w(x)f_{\theta}(x)$. Furthermore, we give sufficient conditions ensuring that the parametric rate of convergence is achieved. We provide practical recipes for the choice of $w$ in the case of nonlinear regression functions that are piecewise smooth, which allows improving the order of the rate of convergence, up to the parametric rate in some cases. We also consider extensions of the estimation procedure, in particular to the case where a choice of $w_{\theta}$ depending on $\theta$ would be more appropriate.
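The abstract does not spell out how the criterion $\tilde{S}_{\theta^0,g}(\theta)$ is made empirical when $X$ is unobserved. A minimal numerical sketch follows, using the standard Fourier-deconvolution device: for a function $\phi$, define $\phi^*(z)=\frac{1}{2\pi}\int e^{-itz}\,\Phi_\phi(t)/\Phi_\epsilon(t)\,dt$, so that $\mathbb{E}[\phi^*(Z)]=\mathbb{E}[\phi(X)]$ when $Z=X+\epsilon$ and $\Phi_\epsilon$ is real and even. All concrete choices below (Gaussian $\epsilon$, a Gaussian weight $w$, a quadratic regression function $f_\theta(x)=\theta_1 x+\theta_2 x^2$, grid sizes) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Model: Y = f_theta0(X) + xi, Z = X + eps, with eps ~ N(0, sig_eps^2) known.
sig_eps = 0.5
theta0 = np.array([1.0, -0.5])

def f(theta, x):
    # hypothetical regression function, quadratic in x
    return theta[0] * x + theta[1] * x**2

n = 2000
X = rng.normal(0.0, 1.0, n)
Y = f(theta0, X) + rng.normal(0.0, 0.3, n)  # xi: unknown density, mean zero
Z = X + rng.normal(0.0, sig_eps, n)

def w(x):
    # Gaussian weight: its fast-decaying Fourier transform keeps the
    # deconvolution integral below well defined despite dividing by Phi_eps
    return np.exp(-x**2 / 8.0)

# grids for the numerical Fourier transforms (illustrative choices)
xg = np.linspace(-15.0, 15.0, 3000)
tg = np.linspace(-5.0, 5.0, 801)

def phi_star(phi_vals, z):
    """phi*(z) = (1/2pi) * int e^{-itz} Phi_phi(t) / Phi_eps(t) dt,
    which satisfies E[phi*(Z)] = E[phi(X)] when Z = X + eps."""
    Phi_phi = trapezoid(phi_vals[None, :] * np.exp(1j * tg[:, None] * xg[None, :]),
                        xg, axis=1)
    Phi_eps = np.exp(-0.5 * sig_eps**2 * tg**2)  # char. function of N(0, sig_eps^2)
    integrand = np.exp(-1j * np.outer(z, tg)) * (Phi_phi / Phi_eps)[None, :]
    return np.real(trapezoid(integrand, tg, axis=1)) / (2.0 * np.pi)

def S_n(theta):
    # Empirical counterpart of E[(Y - f_theta(X))^2 w(X)], dropping the
    # Y^2 w(X) term, which does not depend on theta.
    sq = phi_star(w(xg) * f(theta, xg)**2, Z)    # estimates E[f_theta(X)^2 w(X)]
    cr = phi_star(w(xg) * f(theta, xg), Z)       # Y_j * cr_j estimates E[Y f_theta(X) w(X)]
    return np.mean(sq - 2.0 * Y * cr)

res = minimize(S_n, x0=np.array([0.5, 0.0]), method="Nelder-Mead")
print("estimate:", res.x, "truth:", theta0)
```

The Gaussian weight is one concrete way to realize the abstract's point that the risk bound depends on the smoothness of $w(x)f_\theta(x)$ relative to that of $p_\epsilon$: here $\Phi_{w f_\theta}$ decays much faster than $\Phi_\epsilon$, so the ratio in the deconvolution integral stays bounded.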
