Abstract

In this paper, we investigate the finite sample performance of four classes of kernel-based estimators currently available for additive non-parametric regression models: the classic backfitting estimator (CBE), the smooth backfitting estimator, the marginal integration estimator, and a two-stage estimator in two versions, the first proposed by Kim, Linton and Hengartner (1999) and the second proposed in this paper. Bandwidths for each estimator are selected by minimizing its respective asymptotic approximation to the mean average squared error. In our simulations, we are particularly interested in the performance of these estimators under this unified data-driven bandwidth selection method, since in this case neither the asymptotic nor the finite sample properties of the estimators are currently available. The comparison is based on the estimators' average squared error. Our Monte Carlo results suggest that the CBE is the best-performing kernel-based procedure.
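
For readers unfamiliar with the procedures compared above, the following is a minimal, illustrative sketch of classic backfitting for an additive model together with the average squared error (ASE) criterion used for comparison. It is not the paper's implementation: the Gaussian local-constant smoother, the fixed per-component bandwidths, the simulated design, and the function names (`nw_smooth`, `classic_backfitting`) are assumptions made purely for illustration.

```python
import numpy as np

def nw_smooth(x, y, grid, h):
    """Nadaraya-Watson (local-constant) kernel smoother with a Gaussian kernel."""
    u = (grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u ** 2)                 # kernel weights
    return (w @ y) / w.sum(axis=1)            # weighted local averages

def classic_backfitting(X, y, h, n_iter=50, tol=1e-8):
    """Classic backfitting for y = alpha + sum_j f_j(x_j) + error.

    X: (n, d) covariates, y: (n,) response, h: (d,) bandwidths.
    Returns the intercept and the additive component fits at the data points.
    """
    n, d = X.shape
    alpha = y.mean()
    f = np.zeros((n, d))                      # current component estimates
    for _ in range(n_iter):
        f_old = f.copy()
        for j in range(d):
            # smooth the partial residuals that exclude the j-th component
            r = y - alpha - (f.sum(axis=1) - f[:, j])
            fj = nw_smooth(X[:, j], r, X[:, j], h[j])
            f[:, j] = fj - fj.mean()          # centre for identifiability
        if np.max(np.abs(f - f_old)) < tol:
            break
    return alpha, f

# Illustration: one simulated sample and the average squared error (ASE) of the fit.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 2))
m = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 - np.mean(X[:, 1] ** 2)  # true regression
y = m + rng.normal(scale=0.3, size=n)
alpha, f = classic_backfitting(X, y, h=np.array([0.15, 0.15]))
ase = np.mean((alpha + f.sum(axis=1) - m) ** 2)
print(f"ASE of the backfitting fit: {ase:.4f}")
```

In a Monte Carlo study such as the one described above, the ASE would be averaged over many replications (and the bandwidths chosen by minimizing an asymptotic mean average squared error approximation rather than fixed as here).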
