Low-rank tensor regression (LRTR) problems are widely studied in statistics and machine learning. In many practical applications, the regressors are grouped by clustering strongly correlated variables, or variables that correspond to different levels of the same predictive factor. Drawing on the idea of group selection from the classical linear regression framework, we propose an LRTR method for the adaptive selection of grouped variables, formulated as a group-SLOPE-penalized optimization problem over low-rank, orthogonally decomposable tensors. We further introduce the notion of the tensor group false discovery rate (TgFDR) to measure group selection performance. Under the assumption that the variable groups are mutually orthogonal, the proposed method provably controls the TgFDR and achieves the asymptotically minimax estimate. Finally, we develop an alternating minimization algorithm for efficient problem resolution. Simulation studies and real-data analysis demonstrate the performance of the proposed method in group selection and low-rank estimation.
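To make the group SLOPE penalty concrete, the following minimal sketch evaluates it for a vectorized coefficient example: the l2 norms of the coefficient groups are sorted in decreasing order and matched against a decreasing sequence of penalty weights. This is an illustrative toy (the function name `group_slope_penalty`, the group partition, and the weights are all assumptions for exposition), not the authors' tensor formulation, which applies the penalty within a low-rank, orthogonally decomposable tensor model.

```python
import numpy as np

def group_slope_penalty(beta, groups, lam):
    """Sorted-l1 (SLOPE) penalty applied to group l2 norms.

    beta   : 1-D coefficient vector
    groups : list of index lists partitioning beta into groups
    lam    : decreasing penalty weights, lam[0] >= lam[1] >= ...
    """
    # l2 norm of each coefficient group.
    norms = np.array([np.linalg.norm(beta[idx]) for idx in groups])
    # Pair the largest group norm with the largest weight, and so on.
    sorted_norms = np.sort(norms)[::-1]
    return float(np.dot(lam, sorted_norms))

# Toy example: 3 groups with decreasing penalty weights.
beta = np.array([3.0, 4.0, 0.0, 1.0, 2.0])
groups = [[0, 1], [2], [3, 4]]
lam = np.array([1.0, 0.5, 0.25])
print(group_slope_penalty(beta, groups, lam))  # 1.0*5 + 0.5*sqrt(5) + 0.25*0
```

Because larger group norms receive larger weights, the penalty adapts the amount of shrinkage to the ranking of the groups, which is what enables the FDR-control behavior of SLOPE-type methods.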