Abstract

Robust principal component analysis (RPCA) has been widely used for data analysis problems on matrix data. Robust tensor principal component analysis (RTPCA), a generalization of RPCA, aims to extract the low-rank and sparse components of multidimensional data. Current RTPCA methods are built directly on tensor singular value decomposition (t-SVD), a tensor decomposition analogous to the matrix singular value decomposition (SVD). These methods focus on applying different sparse constraints in real applications and provide little analysis of the tensor nuclear norm (TNN) defined by t-SVD. However, we find that low-rank structure still exists in the core tensor, so existing methods cannot fully extract the low-rank structure of tensor data. To further exploit the low-rank structure in multiway data, we extract a low-rank component from the core matrix, whose entries are taken from the diagonal elements of the core tensor. Based on this idea, we define a new TNN that extends the standard TNN with the core matrix, and we propose a novel algorithm for RTPCA problems. Numerical experiments show that the proposed method outperforms state-of-the-art methods in both accuracy and computational complexity.
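For context, the t-SVD-based TNN that the abstract builds on can be sketched as follows: transform the tensor to the Fourier domain along the third mode, take the SVD of each frontal slice, and sum the singular values. This is a minimal illustrative sketch, not the paper's proposed extended TNN; the function name and the 1/n3 normalization are assumptions (normalization conventions differ across the t-SVD literature).

```python
import numpy as np

def tensor_nuclear_norm(A):
    """Standard t-SVD-based tensor nuclear norm of an n1 x n2 x n3 tensor.

    Sketch under common conventions: FFT along the third mode, then sum
    the singular values of every frontal slice, scaled by 1/n3 (this
    normalization is an assumption; some papers omit it).
    """
    n1, n2, n3 = A.shape
    A_bar = np.fft.fft(A, axis=2)  # frontal slices in the Fourier domain
    tnn = 0.0
    for k in range(n3):
        # singular values of the k-th Fourier-domain frontal slice
        s = np.linalg.svd(A_bar[:, :, k], compute_uv=False)
        tnn += s.sum()
    return tnn / n3
```

With n3 = 1 the tensor reduces to a matrix and this quantity coincides with the matrix nuclear norm, which is the sense in which TNN generalizes the matrix case.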
