Abstract

Quantile regression is a popular method with a wide range of scientific applications, but its computation is challenging. Hunter and Lange proposed an MM algorithm for solving optimization problems in parametric quantile regression models. For nonparametric and semiparametric quantile regression, their algorithm can be applied to estimate unknown quantile functions in a pointwise manner. However, the resulting estimates may suffer from drawbacks such as nonsmoothness, discontinuity points, and instability at extreme quantile levels. To remedy these issues, we propose a new MM algorithm and show that it yields continuous, smoother estimated quantile functions with faster computation. We systematically study the new MM algorithm using the local linear quantile regression model. We prove that the proposed algorithm preserves the monotone descent property in an asymptotic sense. We then extend it to some popular nonparametric and semiparametric quantile regression models. For semiparametric models, we propose new efficient backfitting algorithms based on the new MM algorithm. Compared to traditional backfitting algorithms, the new procedures can significantly reduce the computational cost of fully iterative backfitting. The performance of the proposed algorithms is demonstrated via extensive simulation studies and a real data example. Supplementary materials for this article are available online.
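To make the starting point concrete, the classical Hunter–Lange MM approach referenced above majorizes the (perturbed) check loss by a quadratic surrogate, so each iteration reduces to a weighted least-squares solve. The sketch below is a minimal illustration of that classical parametric MM step, not the new algorithm proposed in this article; the function name, the perturbation constant `eps`, and the stopping rule are assumptions for illustration.

```python
import numpy as np

def mm_quantile_regression(X, y, tau=0.5, eps=1e-6, max_iter=200, tol=1e-8):
    """Linear quantile regression via an MM (IRLS-style) scheme in the
    spirit of Hunter and Lange: at each step the perturbed check loss
    rho_tau(r) = |r|/2 + (tau - 1/2) r is majorized by a quadratic in r,
    yielding a weighted least-squares update."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS warm start
    for _ in range(max_iter):
        r = y - X @ beta
        w = 1.0 / (eps + np.abs(r))          # MM weights from |r| <= (r^2/|r_k| + |r_k|)/2
        # Normal equations of the surrogate:
        #   X' W X beta = X' W y + (2*tau - 1) X' 1
        A = X.T @ (w[:, None] * X)
        b = X.T @ (w * y) + (2.0 * tau - 1.0) * X.sum(axis=0)
        beta_new = np.linalg.solve(A, b)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Because each update minimizes a majorizer of the perturbed objective, the iteration enjoys the monotone descent property that the abstract refers to; the pointwise instability discussed above arises when such fits are repeated separately at many covariate values or extreme quantile levels.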

