Tensor recovery is a fundamental problem in the tensor processing field. It generally requires exploring the intrinsic prior structures underlying tensor data and formulating them as regularization terms that guide a sound estimate of the restored tensor. Recent studies have made significant progress by adopting two insightful tensor priors, i.e., global low-rankness (L) and local smoothness (S), which are typically encoded as a sum of two separate regularizers in recovery models. However, unlike the mature theoretical developments on low-rank tensor recovery, these joint "L+S" models have no theoretical exact-recovery guarantees yet, making the methods unreliable in real practice. To address this crucial issue, in this work we build a unique regularizer, termed tensor correlated total variation (t-CTV), which encodes both the L and S priors of a tensor simultaneously. Notably, by equipping recovery models with t-CTV, we can rigorously prove exact-recovery guarantees for two typical tensor recovery tasks, i.e., tensor completion and tensor robust principal component analysis. To the best of our knowledge, these are the first exact-recovery results among all related "L+S" methods for tensor recovery. We further propose ADMM algorithms with convergence guarantees to solve the proposed models. Significant recovery accuracy improvements are observed in extensive experiments. Notably, our method achieves workable performance on the color image inpainting task even when the missing rate is extremely large, e.g., 99.5%, a challenging case in which all competing methods fail completely. Code is released at https://github.com/wanghailin97.
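As a rough illustration of the idea behind t-CTV (not the authors' exact formulation), the sketch below computes an FFT-based tensor nuclear norm of the finite-difference (gradient) tensors taken along designated smooth modes, so that a single term measures low-rankness of the local differences. The function names, the choice of circular differences, and the set of smooth modes are illustrative assumptions:

```python
import numpy as np

def tensor_nuclear_norm(X):
    # Tensor nuclear norm (TNN) via FFT along the third mode:
    # the averaged sum of singular values of each frontal slice
    # in the Fourier domain.
    Xf = np.fft.fft(X, axis=2)
    n3 = X.shape[2]
    return sum(np.linalg.svd(Xf[:, :, k], compute_uv=False).sum()
               for k in range(n3)) / n3

def t_ctv_sketch(X, smooth_modes=(0, 1)):
    # Illustrative t-CTV-style regularizer: average the TNN of the
    # gradient tensors along each mode assumed to be locally smooth.
    total = 0.0
    for m in smooth_modes:
        # Circular finite difference along mode m (wrap-around at the border).
        G = np.diff(X, axis=m, append=np.take(X, [0], axis=m))
        total += tensor_nuclear_norm(G)
    return total / len(smooth_modes)
```

A constant tensor has zero differences and hence zero regularizer value, while rough, high-rank gradient structure is penalized; this is how one term can couple the L and S priors rather than summing two separate regularizers.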