Abstract

In this paper, we consider a quantile fused LASSO regression model that combines the quantile regression loss with the fused LASSO penalty. Intuitively, the quantile loss confers robustness to outliers, while the fused LASSO penalty effectively recovers sparse, block-structured coefficients. To make the proposed method scalable to ultrahigh-dimensional datasets, we introduce an iterative algorithm based on the multi-block alternating direction method of multipliers (ADMM). Moreover, we establish the global convergence of the algorithm and derive comparable convergence rates. Importantly, our ADMM algorithm can be readily applied to solve various existing fused LASSO models. On the theoretical side, we show that the quantile fused LASSO achieves near-oracle properties with a practical penalty parameter and, in addition, possesses a sure screening property under a wide class of error distributions. Numerical experiments support these claims, showing that the quantile fused LASSO outperforms existing fused regression models in robustness, particularly under heavy-tailed error distributions.
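For reference, a minimal sketch of the objective described above, written in standard notation; the check loss $\rho_\tau$ and the tuning parameters $\lambda_1,\lambda_2$ are assumed notation for illustration and are not taken verbatim from the paper:

\[
\min_{\beta \in \mathbb{R}^p} \;\sum_{i=1}^{n} \rho_\tau\!\bigl(y_i - x_i^{\top}\beta\bigr)
\;+\; \lambda_1 \sum_{j=1}^{p} \lvert \beta_j \rvert
\;+\; \lambda_2 \sum_{j=2}^{p} \lvert \beta_j - \beta_{j-1} \rvert,
\qquad
\rho_\tau(u) = u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr).
\]

A typical multi-block ADMM treatment of such an objective (one plausible splitting, not necessarily the paper's exact formulation) introduces auxiliary variables for the residuals $y - X\beta$ and for the difference terms $\beta_j - \beta_{j-1}$, so that each subproblem reduces to a closed-form proximal update (soft-thresholding for the penalties and the proximal map of the check loss for the residual block).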
