Abstract
Traditional approaches that represent observations in vector or matrix form tend to lose structural information and destroy dependencies between elements. Tensors generalize matrices and provide a natural representation of the rich structure in real-world multiway arrays. Although tensor decomposition has been validated and applied in classification tasks, the heterogeneity gap among different orders is often neglected. In this paper, we exploit the tensor decomposition framework and present a heterogeneous tensor decomposition for robust classification (HTDRC) approach, which integrates the nuclear norm and the ℓ2,1-norm for intrinsic representation learning. Specifically, to obtain the lowest-rank intrinsic representations of tensorial data from an underlying low-dimensional subspace, HTDRC simultaneously learns a set of orthogonality-constrained factor matrices and a low-rank-constrained representation matrix. To better guide the decomposition process, a robust discriminant feature selection scheme is employed by imposing an ℓ2,1-norm penalty on both the classification loss and the regularization terms. Experiments on four datasets demonstrate the superior performance of the proposed approach.
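For readers unfamiliar with this family of models, a minimal sketch of an objective combining these ingredients is given below. It is an illustration under assumed notation, not the authors' exact formulation: here $\mathcal{X}$ denotes the data tensor whose last mode indexes samples, $\mathcal{G}$ a core tensor, $U^{(k)}$ the orthogonality-constrained factor matrices, $Z$ the low-rank representation matrix, $W$ a linear classifier, $Y$ the label matrix, and $\lambda_1,\lambda_2,\lambda_3$ trade-off parameters.

\[
\min_{\{U^{(k)}\},\,Z,\,W}\;
\bigl\lVert \mathcal{X} - \mathcal{G}\times_1 U^{(1)}\cdots\times_{N-1}U^{(N-1)}\times_N Z \bigr\rVert_F^2
+\lambda_1\lVert Z\rVert_{*}
+\lambda_2\lVert Y - ZW\rVert_{2,1}
+\lambda_3\lVert W\rVert_{2,1}
\quad\text{s.t.}\ U^{(k)\top}U^{(k)}=I,\ k=1,\dots,N-1.
\]

In formulations of this kind, the nuclear norm $\lVert Z\rVert_{*}$ encourages a low-rank intrinsic representation, while the ℓ2,1-norm on the classification loss and on $W$ promotes row-sparsity and hence robust, discriminative feature selection.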