Abstract

In statistics and machine learning, increasingly complex data arise in which a response depends on both vector and matrix predictors. Zhou and Li (2014) proposed matrix regression based on the least squares (LS) method, focusing mainly on regularized matrix regression with a nuclear norm penalty under noise with mean zero and fixed covariance. In practice, however, the noise may be heavy-tailed or its distribution unknown; in such cases the least absolute deviation (LAD) method is well known to outperform LS. To exploit the structure of the predictors, we propose in this paper a double fused Lasso penalized LAD for matrix regression, whose penalty combines the fused Lasso with a matrix-type fused Lasso. We establish strong duality between the double fused Lasso penalized LAD problem and its dual. Building on this, we design a highly scalable symmetric Gauss–Seidel based alternating direction method of multipliers (sGS-ADMM) to solve the dual problem, and we prove its global convergence and Q-linear rate of convergence. Finally, the effectiveness of our method is demonstrated by numerical experiments on simulated and real datasets.
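For concreteness, one plausible form of the double fused Lasso penalized LAD objective is sketched below; the abstract does not state the exact formulation, so the notation (responses $y_i$, vector predictors $x_i \in \mathbb{R}^p$, matrix predictors $X_i \in \mathbb{R}^{q \times r}$, coefficients $\beta$ and $B$, and tuning parameters $\lambda_1, \lambda_2$) is assumed here for illustration only:

\[
\min_{\beta,\, B}\; \sum_{i=1}^{n} \left| y_i - x_i^{\top}\beta - \langle X_i, B \rangle \right|
+ \lambda_1 \sum_{j=2}^{p} \left| \beta_j - \beta_{j-1} \right|
+ \lambda_2 \left( \sum_{k=2}^{q}\sum_{l=1}^{r} \left| B_{k,l} - B_{k-1,l} \right|
+ \sum_{k=1}^{q}\sum_{l=2}^{r} \left| B_{k,l} - B_{k,l-1} \right| \right),
\]

where $\langle X_i, B \rangle = \operatorname{tr}(X_i^{\top} B)$. Under this reading, the first penalty is the classical fused Lasso on the vector coefficients, and the second is a matrix-type fused Lasso penalizing differences between adjacent rows and columns of $B$.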
