Abstract

This chapter provides a selective overview of nonconvex penalized quantile regression in high dimension. Quantile regression is a widely recognized and useful alternative to classical least-squares regression. Its most prominent feature is the ability to accommodate heterogeneity, which can arise from heteroskedastic variances or from sources beyond the commonly used location–scale models. Quantile regression allows the covariates to influence the location, dispersion, and other aspects of the conditional distribution. Computationally, quantile regression can be formulated as a convex optimization problem whose objective function is a sum of asymmetrically weighted absolute values of the residuals. Quantile regression enjoys several other appealing properties; in particular, it is naturally robust to outliers in the response space. The chapter reviews nonconvex penalized linear quantile regression in ultra-high dimension and discusses the role of semiparametric quantile regression in high-dimensional data analysis. It also summarizes the large-sample properties of the oracle estimator and of the penalized quantile regression estimator.
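The convex formulation mentioned above rests on the quantile check loss, which weights positive and negative residuals asymmetrically by the quantile level. As a minimal illustrative sketch (not taken from the chapter, and assuming only NumPy), the snippet below defines the check loss and verifies numerically that minimizing it over a constant recovers the sample quantile:

```python
import numpy as np

def check_loss(u, tau):
    """Quantile check loss rho_tau(u): tau*u for u >= 0, (tau - 1)*u for u < 0."""
    return np.where(u >= 0, tau * u, (tau - 1) * u)

# Minimizing sum_i rho_tau(y_i - c) over a constant c yields (approximately,
# up to the grid resolution) the tau-th sample quantile of y.
rng = np.random.default_rng(0)
y = rng.normal(size=1001)
tau = 0.75
grid = np.linspace(y.min(), y.max(), 2001)
losses = [check_loss(y - c, tau).sum() for c in grid]
c_hat = grid[np.argmin(losses)]
```

With covariates, the same loss is minimized over linear predictors, which is the convex program (a linear program after standard reformulation) that the abstract refers to.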

