Abstract

The recently proposed method of tensor completion by parallel matrix factorization via tensor train (TMac-TT) has achieved promising performance in estimating missing entries. However, TMac-TT, which relies on ket augmentation to transform a lower-order tensor into a higher-order one, suffers from serious block artifacts. To address this issue, we build an optimization model that combines low-rank matrix factorization based on the tensor train (TT) rank with total variation regularization, retaining the strength of the TT rank while alleviating block artifacts. We develop a block successive upper-bound minimization (BSUM) algorithm to solve the proposed model and, under mild conditions, theoretically prove that it converges to coordinatewise minimizers. Extensive numerical experiments demonstrate that the proposed method outperforms several existing state-of-the-art methods both qualitatively and quantitatively.
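
For concreteness, a model of the kind described above might plausibly take the following form; the symbols here are illustrative assumptions rather than the paper's exact notation:

$$
\min_{\mathcal{X},\,\{U_k, V_k\}} \ \sum_{k=1}^{N-1} \frac{\alpha_k}{2}\,\bigl\| X_{[k]} - U_k V_k \bigr\|_F^2 \;+\; \lambda\,\mathrm{TV}(\mathcal{X}) \quad \text{s.t.} \quad \mathcal{P}_\Omega(\mathcal{X}) = \mathcal{P}_\Omega(\mathcal{T}),
$$

where $\mathcal{X}$ is the $N$-th-order tensor to be recovered, $X_{[k]}$ denotes its $k$-th canonical matricization (the unfolding associated with the $k$-th TT rank), $U_k V_k$ is a low-rank factorization whose inner dimension is bounded by that TT rank, $\alpha_k \ge 0$ are weights, $\lambda > 0$ balances the total variation term $\mathrm{TV}(\cdot)$, and $\mathcal{P}_\Omega$ restricts to the observed entries of $\mathcal{T}$ on the index set $\Omega$. In a BSUM scheme, each block of variables ($\mathcal{X}$ or a factor pair $U_k, V_k$) would be updated in turn by minimizing a locally tight upper bound of this objective while the other blocks are held fixed.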
