Abstract

There exists a variety of nonparametric regression estimators, with cubic smoothing splines, k-nearest-neighbor (k-NN) estimators, and various types of kernel estimators among the most popular. A class of kernel estimators with local bandwidth depending on the density of the design points is introduced, where the degree of design adaptation may be expressed by a single parameter α in [0, 1]. The adaptation to the design is such that the bandwidth is made larger, to a degree depending on α, where the design is thin. Special values of this parameter correspond approximately to the ordinary (fixed-width) kernel estimator, the smoothing spline, and the k-NN estimator. Hence this method offers a synthesis of some classical methods. The same method allows estimation of derivatives by using appropriate kernels. The influence of the degree of design adaptation on the integrated mean squared error is investigated. There is no uniformly optimal solution: the optimal solution depends in a complex way on the design and the underlying regression function, but there exists a minimax optimal solution, given by the fixed-width kernel smoother. In an empirical investigation, the fixed-width smoother and some mildly design-adaptive estimators performed well.

Essentially, two different weighting schemes for kernel estimation have been introduced in the literature, one traditionally associated with the fixed-design model (the Priestley-Chao-type estimator) and one with the random-design model (the Nadaraya-Watson-type estimator). The derivation of bias and variance under random design for the former allows a more extensive comparison of the two types of kernel estimators.
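To make the design-adaptation idea concrete, the following is a minimal illustrative sketch (not the paper's exact construction): a pilot estimate of the design density drives a local bandwidth b(x) = b0 · f̂(x)^(−α), so the bandwidth grows where the design is thin. The function name, the Gaussian kernel, the Nadaraya-Watson-type weighting, and the pilot density estimator are all assumptions made for illustration; α = 0 recovers a fixed-width smoother, while α = 1 makes the bandwidth roughly inversely proportional to the design density, mimicking a k-NN-style neighborhood.

```python
import numpy as np

def design_adaptive_smoother(x, y, x_eval, b0=0.1, alpha=0.5):
    """Illustrative design-adaptive kernel smoother (sketch only).

    The local bandwidth is b0 * f_hat(t)**(-alpha), where f_hat is a
    crude fixed-width kernel estimate of the design density, so the
    smoothing window widens where the design points are sparse.
    alpha = 0 gives an ordinary fixed-bandwidth kernel estimator.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    def pilot_density(t):
        # Fixed-width Gaussian kernel density estimate of the design.
        return np.mean(np.exp(-0.5 * ((t - x) / b0) ** 2)) / (
            b0 * np.sqrt(2 * np.pi)
        )

    fitted = []
    for t in np.atleast_1d(x_eval):
        f_hat = max(pilot_density(t), 1e-12)
        b_local = b0 * f_hat ** (-alpha)  # wider bandwidth where design is thin
        w = np.exp(-0.5 * ((t - x) / b_local) ** 2)  # Gaussian kernel weights
        fitted.append(np.sum(w * y) / np.sum(w))     # Nadaraya-Watson-type ratio
    return np.array(fitted)
```

A usage sketch: with design points drawn uniformly on [0, 1] and responses y = sin(2πx) plus noise, the estimate at x = 0.25 should be close to sin(π/2) = 1, and intermediate values of α trade off between the fixed-width and k-NN-like behaviors described above.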

Full Text: published version freely available.
