Abstract

Methods for obtaining kernel-based density estimators with lower bias and mean integrated squared error than an estimator based on a standard Normal kernel function are described and discussed. Three main approaches are considered: first, using 'optimal' polynomial kernels as described, for example, by Gasser et al. (1985); second, employing generalised jackknifing as proposed by Jones and Foster (1993); and third, subtracting an estimate of the principal asymptotic bias term from the original estimator. The emphasis in this initial discussion is on their asymptotic properties. The finite sample performance of those estimators with the best asymptotic properties is compared with two adaptive estimators, as well as with the fixed Normal kernel estimator, in a simulation study.
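To make the third approach concrete, the sketch below (not the paper's implementation) shows a fixed-bandwidth standard Normal kernel estimator and a bias-corrected version that subtracts an estimate of the leading asymptotic bias term, \(\tfrac{h^2}{2} f''(x)\) for a Normal kernel. The function names, the choice of bandwidth, and the use of the same bandwidth for estimating \(f''\) are all illustrative assumptions.

```python
import numpy as np

def normal_kernel_density(x, data, h):
    """Fixed-bandwidth kernel density estimate with a standard Normal kernel."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def second_derivative_estimate(x, data, h):
    """Kernel estimate of f''(x), using phi''(u) = (u^2 - 1) phi(u)."""
    u = (x[:, None] - data[None, :]) / h
    phi = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return ((u**2 - 1) * phi).sum(axis=1) / (len(data) * h**3)

def bias_corrected_density(x, data, h):
    """Subtract the estimated principal asymptotic bias term (h^2 / 2) f''(x)."""
    return normal_kernel_density(x, data, h) - 0.5 * h**2 * second_derivative_estimate(x, data, h)

# Illustrative use with simulated data; the bandwidth h = 0.4 is chosen ad hoc.
rng = np.random.default_rng(0)
data = rng.standard_normal(200)
grid = np.linspace(-3.0, 3.0, 61)
f_hat = bias_corrected_density(grid, data, h=0.4)
```

As with the other two approaches mentioned above, such a correction can reduce the leading bias term at the cost of extra variability, and the corrected estimate need not be non-negative everywhere.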
