Abstract

Low-rank tensor completion (LRTC) plays an important role in many fields, such as machine learning, computer vision, image processing, and mathematical theory. Since rank minimization is NP-hard, one common strategy relaxes it to the convex tensor nuclear norm (TNN), which requires repeated, time-consuming SVD computations; another factorizes the tensor into a product of two smaller tensors, which easily falls into local minima. To overcome these shortcomings, we propose a robust tensor factorization (RTF) model for LRTC. In RTF, the noisy tensor data with missing entries is decomposed into a low-rank tensor and a noise tensor, and the low-rank tensor is then equivalently factorized into the t-product (essentially a convolution of tube fibers) of two smaller tensors: an orthogonal dictionary tensor and a low-rank representation tensor. Meanwhile, the TNN of the low-rank representation tensor is adopted to characterize the low-rank structure of the data and preserve global information. An efficient iterative algorithm based on the alternating direction method of multipliers (ADMM) is then derived to solve RTF. Finally, numerical experiments on image recovery and video completion tasks show the effectiveness of the proposed RTF model compared with several state-of-the-art tensor completion models.
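To make the operations named above concrete, here is a minimal NumPy sketch (our illustration, not the authors' code) of the t-product and the TNN. The t-product reduces to independent matrix products between the Fourier-domain frontal slices of the two tensors, which is also why the TNN is computed from slice-wise SVDs in the Fourier domain; the function names `t_product` and `tnn` are ours, and the 1/n3 scaling follows the common convention of the t-SVD literature.

```python
import numpy as np

def t_product(A, B):
    """t-product of A (n1 x n2 x n3) and B (n2 x n4 x n3) -> (n1 x n4 x n3).

    Computed by an FFT along the third mode, slice-wise matrix products,
    and an inverse FFT; equivalent to circular convolution of tube fibers.
    """
    n3 = A.shape[2]
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    Ch = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        Ch[:, :, k] = Ah[:, :, k] @ Bh[:, :, k]
    return np.fft.ifft(Ch, axis=2).real

def tnn(A):
    """Tensor nuclear norm: average of the nuclear norms of the
    Fourier-domain frontal slices (one SVD per slice)."""
    Ah = np.fft.fft(A, axis=2)
    n3 = A.shape[2]
    return sum(np.linalg.norm(Ah[:, :, k], 'nuc') for k in range(n3)) / n3
```

The per-slice SVDs inside `tnn` are exactly the repeated computations the abstract refers to, which motivates factorizing the low-rank tensor and penalizing only the smaller representation tensor.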

Highlights

  • Real-world multidimensional data such as images, videos, and social networks usually exhibit low-rank structures due to local similarities, spatial correlations, etc.

  • We propose a robust tensor factorization (RTF) model for low-rank tensor completion (LRTC) that avoids both the repeated SVDs of TNN-based relaxations and the local minima of plain factorization methods

  • On each dataset, we evaluate the proposed RTF against several state-of-the-art low-rank tensor completion methods on color image recovery and grayscale video recovery tasks

Summary

INTRODUCTION

Real-world multidimensional data such as images, videos, and social networks usually exhibit low-rank structures due to local similarities, spatial correlations, etc. Building on the TNN theory, Lu et al. [14], [22] extend robust principal component analysis (RPCA) to tensor RPCA (TRPCA) and apply it to image inpainting. These methods fill in the unknown entries of the tensor data while using TNN to preserve its low-rank structure. In the RTF model, we first decompose the noisy tensor data with missing entries into a low-rank tensor and a sparse noise tensor; the low-rank tensor is then equivalently decomposed into the t-product of two smaller tensors, called the dictionary tensor and the low-rank representation tensor, respectively. This equivalent decomposition preserves the global low-rank structure of the tensor data and inherits the fast computation of tensor factorization methods.
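As a hedged illustration of how ADMM-type solvers typically treat a TNN term in such models, the sketch below implements tensor singular value thresholding (t-SVT), the proximal operator of the TNN: soft-threshold the singular values of every Fourier-domain frontal slice. This is a generic building block of TNN minimization, not the paper's specific update rule.

```python
import numpy as np

def t_svt(Y, tau):
    """Tensor singular value thresholding: prox of tau * TNN at Y.

    Soft-thresholds the singular values of each Fourier-domain frontal
    slice of Y, then transforms back; Y is a real 3-way array.
    """
    Yh = np.fft.fft(Y, axis=2)
    Xh = np.empty_like(Yh)
    for k in range(Y.shape[2]):
        U, s, Vt = np.linalg.svd(Yh[:, :, k], full_matrices=False)
        Xh[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vt
    return np.fft.ifft(Xh, axis=2).real

# usage: shrink a noisy tensor toward low tubal rank
# L = t_svt(np.random.randn(32, 32, 8), tau=0.5)
```

For real tensors, the conjugate symmetry of the FFT slices can roughly halve the SVD work; the sketch omits that optimization for clarity.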

RELATED WORK
CP DECOMPOSITION
DISCRETE FOURIER TRANSFORMATION
T-SVD AND TENSOR NUCLEAR NORM
OPTIMIZATION OF RTF
CONVERGENCE AND COMPLEXITY
EXPERIMENTS
PARAMETERS SETTING
CONCLUSION