Abstract

This article gives a short overview of proposals for locally adaptive kernel regression estimation. Computational and algorithmic aspects of a new variant and generalization of the iterative plug-in rule by Brockmann, Gasser, and Herrmann are described in detail. This new bandwidth selector adapts to heteroscedasticity and can also be used for nonequidistant design. A simulation study supplements the article.

Classical approaches to nonparametric regression estimation use linear methods such as kernel smoothing, orthogonal series, or smoothing splines, with a global smoothing parameter adapted to the data at hand. More refined proposals adapt the smoothing parameters locally. Recently, interest in local methods has increased through research on locally adaptive wavelet methods. These wavelet methods enjoy nearly optimal asymptotic behavior over broad risk and function classes. Nevertheless, classical methods, such as local variable bandwidth kernel estimators, can compete with these new methods, at least in practical terms. This might be due to the extensive practical experience with these methods and to the refined algorithmic versions that have been developed. (Some theoretical aspects are discussed by Fan, Hall, Martin, and Patil.) Additionally, classical methods can easily be modified for more general regression models without losing their typical structure, as is demonstrated by the proposal in this article.
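To make the idea of a local variable bandwidth kernel estimator concrete, the following Python sketch shows the basic structure of a Nadaraya-Watson estimate evaluated with a separate bandwidth at each grid point. This is an illustration only, not the authors' iterative plug-in selector: the bandwidths below are a hand-chosen placeholder where the plug-in rule would supply data-driven local values, and the Gaussian kernel and all variable names are assumptions.

```python
import numpy as np

def local_kernel_regression(x, y, x_grid, bandwidths):
    """Nadaraya-Watson estimate with a locally varying bandwidth.

    x, y       : observed design points and responses (1-D arrays)
    x_grid     : points at which to evaluate the estimate
    bandwidths : one bandwidth per grid point (local adaptation)
    """
    # Gaussian kernel chosen for simplicity; published selectors often
    # use compactly supported kernels, but the structure is the same.
    def K(u):
        return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

    estimates = np.empty_like(x_grid, dtype=float)
    for j, (t, h) in enumerate(zip(x_grid, bandwidths)):
        w = K((x - t) / h)                    # kernel weights around t
        estimates[j] = np.sum(w * y) / np.sum(w)
    return estimates

# Toy data: heteroscedastic noise on a nonequidistant design.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))       # nonequidistant design
sigma = 0.05 + 0.3 * x                        # noise level grows with x
y = np.sin(4.0 * np.pi * x) + sigma * rng.normal(size=x.size)

x_grid = np.linspace(0.05, 0.95, 50)
# Placeholder local bandwidths, wider where the noise is larger;
# an iterative plug-in selector would estimate these from the data.
h_local = 0.03 + 0.08 * x_grid
m_hat = local_kernel_regression(x, y, x_grid, h_local)
```

Because each evaluation point carries its own bandwidth, the estimator can smooth more where the noise variance is large and less where the regression function varies quickly, which is exactly the degree of freedom a locally adaptive bandwidth selector exploits.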
