In this paper, we consider a quantile fused LASSO regression model that combines the quantile regression loss with the fused LASSO penalty. Intuitively, this model is robust to outliers, owing to the quantile loss, while effectively recovering sparse and blockwise-constant coefficients through the fused LASSO penalty. To scale the proposed method to ultrahigh-dimensional datasets, we introduce an iterative algorithm based on the multi-block alternating direction method of multipliers (ADMM). Moreover, we prove the global convergence of the algorithm and derive convergence rates comparable to those of existing methods. Importantly, our ADMM algorithm can be readily applied to solve various existing fused LASSO models. On the theoretical side, we establish that the quantile fused LASSO achieves near-oracle properties with a practical choice of the penalty parameter and, in addition, possesses a sure screening property under a wide class of error distributions. Numerical results support these claims, showing that the quantile fused LASSO outperforms existing fused regression models in robustness, particularly under heavy-tailed error distributions.
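For concreteness, a common generic formulation of such a model, assuming the standard check loss $\rho_\tau$ together with a fused LASSO penalty having sparsity parameter $\lambda_1$ and fusion parameter $\lambda_2$ (a sketch of the usual form, not necessarily the exact model studied in this paper), is
\[
\min_{\beta \in \mathbb{R}^p} \; \frac{1}{n}\sum_{i=1}^{n} \rho_\tau\bigl(y_i - x_i^\top \beta\bigr)
+ \lambda_1 \sum_{j=1}^{p} |\beta_j|
+ \lambda_2 \sum_{j=2}^{p} |\beta_j - \beta_{j-1}|,
\qquad
\rho_\tau(u) = u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\]
where the check loss $\rho_\tau$ provides robustness to heavy-tailed errors, while the two penalty terms encourage sparsity and blockwise-constant structure in $\beta$, respectively.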