Abstract

This article concerns statistical inference for partially linear additive regression models in which the covariates in the parametric component are measured with error. Using polynomial spline approximation, we propose bias-corrected least squares estimators for the parameters, establish their asymptotic normality, and show that the estimators of the unknown functions achieve the optimal nonparametric convergence rate. Moreover, we propose a variable selection procedure to identify significant regressors and derive the oracle property of the penalized estimators. Finally, we propose a two-stage local polynomial estimator for the additive functions and establish its asymptotic normality. Monte Carlo studies and a real data analysis illustrate the performance of our approaches.
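The core idea of bias-corrected least squares under classical measurement error can be sketched in a simple linear setting: naive least squares on error-contaminated covariates is attenuated, and subtracting the (assumed known) measurement-error covariance from the Gram matrix removes the bias. The sketch below is a minimal illustration of this principle only; the paper's estimator additionally handles the nonparametric additive component via polynomial splines, which is omitted here, and all variable names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
beta = np.array([2.0, -1.0])         # true parametric coefficients
sigma_u2 = 0.25                      # assumed-known measurement error variance

X = rng.normal(size=(n, 2))                               # true, unobserved covariates
W = X + rng.normal(scale=np.sqrt(sigma_u2), size=(n, 2))  # observed with error
y = X @ beta + rng.normal(scale=0.5, size=n)

# Naive OLS regresses y on the contaminated W; estimates are attenuated toward zero.
beta_naive = np.linalg.solve(W.T @ W, W.T @ y)

# Bias correction: subtract the measurement-error covariance (scaled by n)
# from the Gram matrix W'W before solving the normal equations.
Suu = sigma_u2 * np.eye(2)
beta_corrected = np.linalg.solve(W.T @ W - n * Suu, W.T @ y)
```

With these simulated data, `beta_corrected` recovers the true coefficients much more closely than `beta_naive`, which illustrates why the correction matters before the spline-based additive machinery is layered on top.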
