Abstract

We address the problem of smoothing parameter selection for nonparametric curve estimators in the specific context of kernel regression estimation. Call the “optimal bandwidth” the minimizer of the average squared error. We consider several automatically selected bandwidths that approximate the optimum. How far are the automatically selected bandwidths from the optimum? The answer is studied theoretically and through simulations. The theoretical results include a central limit theorem that quantifies the convergence rate and gives the asymptotic distribution of the difference. The convergence rate turns out to be excruciatingly slow. This is not too disappointing, because this rate is of the same order as the convergence rate of the difference between the minimizers of the average squared error and the mean average squared error. In some simulations by John Rice, the selectors considered here performed quite differently from each other. We anticipated that these differences would be reflected in different asymptotic distributions for the various selectors. It is surprising that all of the selectors have the same limiting normal distribution. To provide insight into the gap between our theoretical results and these simulations, we did a further Monte Carlo study. Our simulations support the theoretical results and suggest that the differences observed by Rice were principally due to the choice of a very small error standard deviation and the choice of error criterion. In the example considered here, the asymptotic normality result describes the empirical distribution of the automatically chosen bandwidths quite well, even for small samples.
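To make the setup concrete, the following is a minimal sketch (not part of the paper) of the quantities the abstract refers to: on simulated data it compares the ASE-optimal bandwidth, which uses the true regression function, with an automatically selected bandwidth chosen by leave-one-out cross-validation for a Nadaraya-Watson kernel estimator. The regression function, sample size, error standard deviation, and bandwidth grid are illustrative assumptions, not the authors' simulation design.

```python
# Sketch: ASE-optimal vs. automatically selected (cross-validated) bandwidth
# for Nadaraya-Watson kernel regression. All design choices here (m, n,
# sigma, kernel, grid) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 100, 0.5
m = lambda x: np.sin(2 * np.pi * x)      # assumed true regression function
x = rng.uniform(0.0, 1.0, n)
y = m(x) + sigma * rng.standard_normal(n)

def nw_weights(h):
    """Gaussian-kernel weight matrix w[i, j] = K((x_i - x_j) / h)."""
    return np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)

def ase(h):
    """Average squared error (1/n) * sum_i (m_hat(x_i) - m(x_i))^2."""
    w = nw_weights(h)
    m_hat = w @ y / w.sum(axis=1)        # Nadaraya-Watson fit at design points
    return np.mean((m_hat - m(x)) ** 2)

def cv(h):
    """Leave-one-out cross-validation score for bandwidth h."""
    w = nw_weights(h)
    np.fill_diagonal(w, 0.0)             # exclude each point from its own fit
    m_loo = w @ y / w.sum(axis=1)
    return np.mean((y - m_loo) ** 2)

grid = np.linspace(0.01, 0.5, 200)
h_opt = grid[np.argmin([ase(h) for h in grid])]  # optimal bandwidth (uses true m)
h_cv = grid[np.argmin([cv(h) for h in grid])]    # automatically selected bandwidth
print(f"ASE-optimal h: {h_opt:.3f}, cross-validated h: {h_cv:.3f}")
```

The quantity of interest in the paper is the (suitably normalized) difference between h_cv and h_opt across repeated samples; the central limit theorem describes its limiting distribution.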
