Abstract

For nonparametric regression with random design, we consider Cleveland's locally weighted regression (LWR). Under some regularity conditions, we show that the regression function estimator derived from LWR has the same asymptotic variance as the k-nearest-neighbor (k-NN) smoother, but a smaller asymptotic bias. Consequently, the LWR estimator has two advantages: it is better than the k-NN smoother in a minimax sense, and it does not suffer from boundary effects when the boundary points of the support of the design density are known in advance. The observation that the LWR estimator has favorable asymptotic bias and boundary behavior is motivated by the work of Fan and Gijbels and of Fan. Estimators of derivatives of the regression function derived from LWR are also considered.
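To make the construction concrete, the following is a minimal sketch of a locally weighted (local linear) regression estimator of the kind the abstract describes: at each point x0, a linear model is fitted by weighted least squares with kernel weights centered at x0, and the fitted intercept estimates the regression function there. The Gaussian kernel and the bandwidth value are illustrative choices, not taken from the paper.

```python
import numpy as np

def lwr_estimate(x0, x, y, h):
    """Local linear regression estimate of m(x0).

    Sketch only: Gaussian kernel weights and a degree-1 local fit;
    the intercept of the weighted least-squares fit estimates m(x0).
    """
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear design
    XtW = X.T * w                                   # apply weights
    beta = np.linalg.solve(XtW @ X, XtW @ y)        # weighted LS fit
    return beta[0]                                  # intercept = m-hat(x0)

# Usage: recover a smooth trend from noisy observations.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, 200)
fit = np.array([lwr_estimate(x0, x, y, h=0.05) for x0 in x])
```

Because the local fit is linear rather than constant, the estimator adapts near the endpoints of the design support, which is the boundary property the abstract refers to.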
