Abstract

The skewing method, which was originally proposed as a bias-correction device for local linear regression estimation with standard symmetric kernels, is extended to asymmetric kernels. The method is defined as a convex combination of three local linear estimators. It is demonstrated that the skewed estimator using asymmetric kernels with properly chosen weights accelerates the bias convergence from O(b) to O(b^2) as b → 0 under sufficient smoothness of the unknown regression curve, without inflating the order of magnitude of the variance, where b is the smoothing parameter and the regressor is assumed to have at least one boundary. As a consequence, the estimator attains the optimal pointwise convergence rate of n^(-4/9) when best implemented, where n is the sample size. It is noteworthy that these properties are the same as those of a local cubic regression estimator. Finite-sample properties of the skewed estimator are assessed in comparison with local linear and local cubic estimators. An application of the skewed estimator to real data is also considered.
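
The sketch below is only an illustration of the general idea summarized above: a local linear smoother built with an asymmetric (gamma-type) kernel for a nonnegative regressor, combined convexly across three fits. The gamma kernel form, the offsets x ± b, and the weights (1/4, 1/2, 1/4) are placeholder assumptions for illustration, not the paper's actual construction or its optimal weight choices.

```python
# Minimal sketch (assumptions noted above): gamma-kernel local linear fits
# combined as a convex combination, in the spirit of the skewing method.
import numpy as np
from scipy.special import gammaln


def gamma_kernel_weights(center, data, b):
    """Gamma-type asymmetric kernel weights for a nonnegative regressor."""
    shape = center / b + 1.0  # shape parameter of the gamma density
    log_w = ((shape - 1.0) * np.log(data)
             - data / b
             - shape * np.log(b)
             - gammaln(shape))
    return np.exp(log_w)


def local_linear(x, data_x, data_y, b, center=None):
    """Local linear fit with kernel weights centered at `center`,
    with the fitted line evaluated at x."""
    c = x if center is None else center
    w = gamma_kernel_weights(c, data_x, b)
    d = data_x - c
    s0, s1, s2 = w.sum(), (w * d).sum(), (w * d * d).sum()
    t0, t1 = (w * data_y).sum(), (w * d * data_y).sum()
    denom = s0 * s2 - s1 ** 2
    intercept = (s2 * t0 - s1 * t1) / denom
    slope = (s0 * t1 - s1 * t0) / denom
    return intercept + slope * (x - c)


def skewed_estimate(x, data_x, data_y, b,
                    offsets=(-1.0, 0.0, 1.0),
                    weights=(0.25, 0.5, 0.25)):
    """Convex combination of three local linear fits whose kernels are
    centered at x + offset*b; offsets and weights are illustrative only."""
    fits = [local_linear(x, data_x, data_y, b, center=max(x + o * b, 0.0))
            for o in offsets]
    return float(np.dot(weights, fits))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data_x = rng.gamma(shape=2.0, scale=1.0, size=500)  # regressor with a boundary at 0
    data_y = np.sin(data_x) + 0.2 * rng.standard_normal(500)
    print(skewed_estimate(0.5, data_x, data_y, b=0.2))
```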
