Abstract

For multivariate regressors, integrating the Nadaraya–Watson regression smoother produces estimators of the lower-dimensional marginal components that are asymptotically normally distributed at the optimal rate of convergence. Some heuristics, based on the consistency of the pilot estimator, suggested that the estimator would not converge at the optimal rate in the presence of more than four covariates. This paper shows first that marginal integration, with its internally normalized counterpart, leads to rate-optimal estimators of the marginal components. We introduce the necessary modifications and give central limit theorems. It is then shown that the method also applies to more general models; in particular, we discuss feasible estimation of partial linear models. The proofs reveal that the pilot estimator should over-smooth the variables to be integrated out, and that the resulting estimator is itself a lower-dimensional regression smoother. Hence, the finite-sample properties of the estimator are comparable to those of low-dimensional nonparametric regression. Further advantages of starting with the internally normalized pilot estimator are its computational attractiveness and its better performance, compared to the classical counterpart, when the covariates are correlated and nonuniformly distributed. Simulation studies underline the excellent performance in comparison with previously known methods.
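
The following is a minimal sketch of the classical marginal-integration idea with a Nadaraya–Watson pilot estimator, for a bivariate additive toy model; it is not the paper's internally normalized estimator, and the Gaussian kernel, bandwidth values, and toy data-generating process are illustrative assumptions only. The larger bandwidth h2 mimics the abstract's point that the pilot should over-smooth the direction that is integrated out.

```python
import numpy as np

def nw_pilot(x1, x2, X1, X2, Y, h1, h2):
    """Nadaraya-Watson pilot estimate at (x1, x2) with a Gaussian product
    kernel; h2 > h1 over-smooths the direction that will be integrated out."""
    w = (np.exp(-0.5 * ((x1 - X1) / h1) ** 2) *
         np.exp(-0.5 * ((x2 - X2) / h2) ** 2))
    return np.sum(w * Y) / np.sum(w)

def marginal_integration(x1_grid, X1, X2, Y, h1, h2):
    """Classical marginal integration: average the pilot estimator over the
    empirical distribution of the nuisance covariate X2."""
    est = np.empty(len(x1_grid))
    for k, x1 in enumerate(x1_grid):
        est[k] = np.mean([nw_pilot(x1, x2j, X1, X2, Y, h1, h2) for x2j in X2])
    return est

# Toy additive model (assumption for illustration): m(x1, x2) = sin(2*pi*x1) + x2**2
rng = np.random.default_rng(0)
n = 400
X1, X2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
Y = np.sin(2 * np.pi * X1) + X2 ** 2 + 0.1 * rng.standard_normal(n)

grid = np.linspace(0.05, 0.95, 19)
m1_hat = marginal_integration(grid, X1, X2, Y, h1=0.08, h2=0.25)
# Centre the estimate: additive components are identified only up to a constant.
m1_hat -= m1_hat.mean()
```

The internally normalized variant discussed in the paper differs in how the kernel weights are normalized before integration, which is what yields its computational and finite-sample advantages under correlated, nonuniform covariates.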
