Abstract

Low-rank tensor recovery that exploits subspace prior information is an emerging topic that has attracted considerable attention. However, existing studies cannot flexibly and fully utilize the accessible subspace prior information, which leads to suboptimal recovery performance. To address this issue, this article presents a novel strategy, based on the tensor singular value decomposition (t-SVD), that integrates more than two layers of subspace knowledge about the columns and rows of the target tensor into one unified recovery framework. Specifically, we first design a multilayer subspace prior learning scheme and then apply it to two common low-rank tensor recovery problems, namely tensor completion and tensor robust principal component analysis. Crucially, we prove that our approach achieves exact recovery of tensors under a significantly weaker incoherence assumption than the analogous conditions proposed previously. Furthermore, two efficient algorithms with convergence guarantees, based on the alternating direction method of multipliers (ADMM), are proposed to solve the corresponding models. Experimental results on synthetic and real tensor data show that the proposed algorithms outperform other state-of-the-art algorithms in terms of both qualitative and quantitative metrics.
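As background for the t-SVD machinery the abstract refers to, the following is a minimal NumPy sketch of the standard t-SVD and t-product construction (FFT along the third mode, slice-wise matrix SVD, inverse FFT). The function names t_svd, t_product, and t_transpose are illustrative; this is generic background only, not the authors' multilayer subspace prior learning scheme or their ADMM solvers.

import numpy as np

def t_product(A, B):
    # t-product of 3-way tensors: slice-wise matrix products in the Fourier domain.
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.real(np.fft.ifft(Cf, axis=2))

def t_transpose(A):
    # Tensor transpose under the t-product: transpose each frontal slice,
    # then reverse the order of slices 2 through n3.
    At = np.transpose(A, (1, 0, 2))
    return np.concatenate([At[:, :, :1], At[:, :, 1:][:, :, ::-1]], axis=2)

def t_svd(X):
    # t-SVD of a real 3-way tensor X (n1 x n2 x n3): FFT along mode 3,
    # matrix SVD of each frontal slice, inverse FFT back. Returns factors
    # U (n1 x n1 x n3), S (n1 x n2 x n3, f-diagonal), V (n2 x n2 x n3)
    # with X = U * S * V^T under the t-product.
    n1, n2, n3 = X.shape
    Xf = np.fft.fft(X, axis=2)
    Uf = np.zeros((n1, n1, n3), dtype=complex)
    Sf = np.zeros((n1, n2, n3), dtype=complex)
    Vf = np.zeros((n2, n2, n3), dtype=complex)
    r = min(n1, n2)
    for k in range(n3):
        u, s, vh = np.linalg.svd(Xf[:, :, k])
        Uf[:, :, k] = u
        Sf[np.arange(r), np.arange(r), k] = s
        Vf[:, :, k] = vh.conj().T
    U = np.real(np.fft.ifft(Uf, axis=2))
    S = np.real(np.fft.ifft(Sf, axis=2))
    V = np.real(np.fft.ifft(Vf, axis=2))
    return U, S, V

# Sanity check: the factors reconstruct X up to numerical error.
X = np.random.randn(6, 5, 4)
U, S, V = t_svd(X)
X_rec = t_product(t_product(U, S), t_transpose(V))
print(np.allclose(X, X_rec))  # expected: True

The slice-wise SVDs in the Fourier domain are what subspace-based recovery methods in this line of work typically use to extract column and row subspace information of the target tensor.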
