We propose a new approach to conditional quantile function estimation that combines parametric and nonparametric techniques. At each design point, a global, possibly incorrect, pilot parametric model is locally adjusted through a kernel smoothing fit. The resulting quantile regression estimator behaves like a parametric estimator when the latter is correct and converges to the nonparametric solution as the parametric start deviates from the true underlying model. We give a Bahadur-type representation of the proposed estimator from which consistency and asymptotic normality are derived under an α-mixing assumption. We also propose a practical bandwidth selector based on the plug-in principle and discuss the numerical implementation of the new estimator. Finally, we investigate the performance of the proposed method via simulations and illustrate the methodology with a data example.
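To make the two-stage idea concrete, the sketch below shows one plausible instantiation, not the paper's actual estimator: a linear pilot quantile model fitted globally by check-loss minimization, followed by an additive local-constant correction at a design point x0, estimated with Gaussian kernel weights. The choice of pilot, kernel, correction form, and the names tau (quantile level), h (bandwidth), and x0 are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above) of a parametrically guided
# kernel quantile estimator: global pilot fit plus local adjustment.
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Quantile (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def fit_pilot_linear(x, y, tau):
    """Global pilot: linear quantile regression m(x; a, b) = a + b * x."""
    def objective(theta):
        a, b = theta
        return np.sum(check_loss(y - (a + b * x), tau))
    res = minimize(objective, x0=np.array([np.quantile(y, tau), 0.0]),
                   method="Nelder-Mead")
    a, b = res.x
    return lambda t: a + b * t

def local_adjustment(x, y, tau, pilot, x0, h):
    """Kernel-weighted local-constant correction to the pilot at x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # Gaussian kernel weights
    resid = y - pilot(x)                     # departures from the pilot fit
    def objective(c):
        return np.sum(w * check_loss(resid - c, tau))
    res = minimize(objective, x0=np.array([0.0]), method="Nelder-Mead")
    return res.x[0]

def guided_quantile_estimate(x, y, tau, x0, h):
    """Pilot prediction plus local correction: close to the pilot when the
    parametric model is (nearly) correct, and driven by the kernel fit
    when the pilot is misspecified."""
    pilot = fit_pilot_linear(x, y, tau)
    return pilot(x0) + local_adjustment(x, y, tau, pilot, x0, h)

# Example: median regression (tau = 0.5) on simulated data.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 300)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(300)
print(guided_quantile_estimate(x, y, tau=0.5, x0=0.25, h=0.1))
```

In practice the bandwidth h would be chosen by a data-driven rule such as the plug-in selector discussed in the paper; the fixed value above is only for illustration.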