Abstract

In this paper we introduce a smooth version of local linear regression estimators and discuss their advantages. The MSE and MISE of the estimators are computed explicitly. It turns out that the local linear regression smoothers have nice sampling properties and high minimax efficiency: they are not only efficient in rates but also nearly efficient in constant factors. In the nonparametric regression context, the asymptotic minimax lower bound is developed via the heuristic of the "hardest one-dimensional subproblem" of Donoho and Liu. Connections between the minimax risk and the modulus of continuity are made. The lower bound also applies to estimating the conditional mean (regression) and conditional quantiles for both fixed and random design regression problems.
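For readers unfamiliar with the estimator under study, the following is a minimal sketch of a local linear regression smoother: at each point x0 it fits a weighted least squares line using kernel weights and returns the intercept as the estimate of the regression function. The Gaussian kernel, fixed bandwidth h, and function names below are illustrative assumptions, not details taken from the paper.

import numpy as np

def local_linear(x0, x, y, h):
    """Estimate m(x0) by kernel-weighted least squares on (x, y)."""
    # Kernel weights centered at x0 (Gaussian kernel; any symmetric kernel works).
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    # Local design matrix: intercept and (x - x0).
    X = np.column_stack([np.ones_like(x), x - x0])
    # Solve the weighted least squares problem; the fitted intercept estimates m(x0).
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]

# Usage: smooth a noisy sine curve over a grid of evaluation points.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=200)
grid = np.linspace(0.05, 0.95, 50)
fit = np.array([local_linear(x0, x, y, h=0.08) for x0 in grid])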
