Abstract

Data sharpening in kernel regression has been shown to be an effective method of reducing bias while having minimal effect on variance. Earlier efforts to iterate the data sharpening procedure were less effective because they employed an inappropriate sharpening transformation. In the present paper, an iterated data sharpening algorithm is proposed which reduces the asymptotic bias at each iteration while having only a modest effect on the variance. The efficacy of the iterative approach is demonstrated both theoretically and via a simulation study. When the iteration is applied to local constant regression, boundary effects persist and the affected region grows with each iteration; by contrast, when it is applied to local linear regression, boundary bias decreases at each iteration step. This study also shows that the iterated estimates are less sensitive to bandwidth choice, and a further simulation study demonstrates that iterated data sharpening with data-driven bandwidth selection via cross-validation can lead to more accurate regression function estimation. Examples with real data illustrate the scope of improvement made possible by iterated data sharpening and also identify its limitations.
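To make the idea concrete, below is a minimal sketch of the kind of iteration the abstract describes, applied to local constant (Nadaraya–Watson) regression. It assumes a Gaussian kernel and the standard sharpening recurrence y_k = y + (I − S)y_{k−1}, where S is the smoother matrix; the function names and parameter choices are illustrative, not taken from the paper.

```python
import numpy as np

def nw_smoother(x, h):
    """Nadaraya-Watson (local constant) smoother matrix with a Gaussian kernel.

    Row i holds the normalized kernel weights used to estimate the
    regression function at x[i], so each row sums to one.
    """
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return K / K.sum(axis=1, keepdims=True)

def sharpened_fit(x, y, h, n_iter=1):
    """Iterated data sharpening: repeatedly sharpen the responses, then smooth.

    Each pass replaces the working responses y_k with y + (I - S) y_k,
    adding back the current residual before the final smooth S @ y_k.
    n_iter = 0 recovers the ordinary Nadaraya-Watson fit.
    """
    S = nw_smoother(np.asarray(x, float), h)
    y = np.asarray(y, float)
    y_k = y.copy()
    for _ in range(n_iter):
        y_k = y + (y_k - S @ y_k)  # sharpen: add back the smoothed-out residual
    return S @ y_k
```

On a smooth target with an oversmoothing bandwidth, a couple of sharpening iterations noticeably reduce interior bias relative to the plain fit, consistent with the bias reduction described above; as the abstract notes, boundary bias is not repaired by this local constant version.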
