Abstract
For multivariate regressors, integrating the Nadaraya–Watson regression smoother produces estimators of the lower-dimensional marginal components that are asymptotically normally distributed at the optimal rate of convergence. Some heuristics, based on the consistency of the pilot estimator, suggested that the estimator would not converge at the optimal rate in the presence of more than four covariates. This paper first shows that marginal integration with its internally normalized counterpart leads to rate-optimal estimators of the marginal components. We introduce the necessary modifications and give central limit theorems. It is then shown that the method also applies to more general models; in particular, we discuss feasible estimation of partial linear models. The proofs reveal that the pilot estimator should over-smooth the variables to be integrated out, and that the resulting estimator is itself a lower-dimensional regression smoother. Hence, the finite-sample properties of the estimator are comparable to those of low-dimensional nonparametric regression. Further advantages of starting with the internally normalized pilot estimator are its computational attractiveness and its better performance (compared to the classical counterpart) when the covariates are correlated and nonuniformly distributed. Simulation studies underline the excellent performance in comparison with previously known methods.
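To make the procedure concrete, the following minimal Python sketch illustrates empirical marginal integration with an internally normalized Nadaraya–Watson pilot: the pilot at a point is a kernel average of the responses divided by a kernel density estimate at each observation, and the first marginal component is obtained by averaging the pilot over the observed values of the remaining covariates, using a larger (over-smoothing) bandwidth in the directions that are integrated out. All function names, the simulated additive model, and the specific bandwidth values below are illustrative assumptions, not the paper's exact implementation or tuning.

```python
# Illustrative sketch only: marginal integration of an internally
# normalized Nadaraya-Watson pilot estimator (product Gaussian kernel).
import numpy as np

def gauss_kernel(u):
    """Standard Gaussian kernel, applied componentwise."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def density_at_obs(X, h):
    """Kernel density estimate f_hat(X_i) at every observation."""
    n, _ = X.shape
    return np.array([
        np.mean(np.prod(gauss_kernel((X[i][None, :] - X) / h) / h, axis=1))
        for i in range(n)
    ])

def internal_nw_pilot(x, X, Y, h, f_hat):
    """Internally normalized NW pilot:
    m_tilde(x) = n^{-1} sum_i K_h(x - X_i) * Y_i / f_hat(X_i)."""
    W = np.prod(gauss_kernel((x[None, :] - X) / h) / h, axis=1)
    return np.mean(W * Y / f_hat)

def marginal_integration(x1_grid, X, Y, h1, h2):
    """Estimate the first marginal component by averaging the pilot over the
    empirical distribution of the remaining covariates.  h1 smooths the
    direction of interest; h2 (over-smoothed) the directions integrated out."""
    n, d = X.shape
    h = np.concatenate(([h1], np.full(d - 1, h2)))
    f_hat = density_at_obs(X, h)          # computed once for the fixed bandwidths
    est = np.empty(len(x1_grid))
    for k, x1 in enumerate(x1_grid):
        # average the pilot over the observed nuisance covariates X_{j,2..d}
        vals = [internal_nw_pilot(np.concatenate(([x1], X[j, 1:])), X, Y, h, f_hat)
                for j in range(n)]
        est[k] = np.mean(vals)
    return est

# Hypothetical usage on a small additive model Y = sin(pi*X1) + X2^2 + noise
rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.uniform(-1, 1, size=(n, d))
Y = np.sin(np.pi * X[:, 0]) + X[:, 1]**2 + 0.1 * rng.standard_normal(n)
grid = np.linspace(-0.8, 0.8, 9)
# assumed bandwidths: moderate in the direction of interest, deliberately
# larger (over-smoothing) in the direction that is integrated out
m1_hat = marginal_integration(grid, X, Y, h1=0.2, h2=0.5)
print(np.round(m1_hat, 3))
```

The deliberate choice of a larger bandwidth for the integrated-out direction mirrors the over-smoothing requirement described in the abstract; the particular values used here are placeholders rather than recommended settings.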