Abstract

Sparse quantile regression is a useful tool for variable selection, robust estimation, and the treatment of heteroscedasticity in high-dimensional data analysis. Owing to the non-smoothness of the quantile loss, its computation is heavier than that of least squares models. In the literature, there are various numerical methods for linear quantile regression, such as the ADMM algorithm developed in Gu et al. [Technometrics. 2017;60(3):319–331]. However, the computation of multivariate quantile regression has not yet been fully resolved, especially when the dimension is high. Motivated by this, we focus on the design of fast numerical algorithms for the row-sparse multivariate quantile regression model. By virtue of the proximal operator and Majorize–Minimization, four smoothed algorithms are designed. For all the obtained algorithms, we analyse their convergence and parameter selection. We conduct extensive simulations and analyse four real data sets. Finally, we conclude that the smoothed methods are faster than the non-smooth method, especially when the number of predictors is large.
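To make the two ingredients named above concrete, the sketch below illustrates one common way to smooth the quantile (check) loss, via convolution with a uniform kernel of bandwidth h, together with the proximal operator of a row-wise group (ℓ2,1) penalty that induces row sparsity in the coefficient matrix. This is a minimal illustration under assumed choices; the paper's actual smoothing scheme, penalty, and parameter names (tau, h, lam) may differ.

```python
import numpy as np

def smoothed_check_loss(u, tau, h):
    """Uniform-kernel smoothed quantile loss, applied elementwise.

    Assumed smoothing: quadratic on |u| <= h, exact check loss outside,
    matched so the loss and its gradient are continuous at |u| = h.
    """
    quad = u**2 / (4 * h) + (tau - 0.5) * u + h / 4   # smooth part near zero
    check = u * (tau - (u < 0))                       # exact check loss
    return np.where(np.abs(u) <= h, quad, check)

def smoothed_check_grad(u, tau, h):
    """Gradient of the smoothed loss above."""
    return np.where(np.abs(u) <= h, u / (2 * h) + tau - 0.5, tau - (u < 0))

def prox_row_l21(B, lam):
    """Proximal operator of lam * sum_j ||B[j, :]||_2.

    Group soft-thresholding of the rows of B: rows with small norm are
    zeroed out, which yields the row-sparse structure.
    """
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return scale * B
```

In a proximal-gradient or MM iteration, one would take a gradient (or majorizer) step on the smoothed loss and then apply prox_row_l21 to the updated coefficient matrix; the concrete step sizes and stopping rules depend on the algorithm variants analysed in the paper.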
