Abstract

Low-Rank Tensor Recovery (LRTR), the higher-order generalization of Low-Rank Matrix Recovery (LRMR), is especially suitable for analyzing multi-linear data with gross corruptions, outliers and missing values, and it has attracted broad attention in the fields of computer vision, machine learning and data mining. This paper considers a generalized model of LRTR and attempts to simultaneously recover the low-rank, the sparse, and the small disturbance components from partial entries of a given data tensor. Specifically, we first describe generalized LRTR as a tensor nuclear norm optimization problem that minimizes a weighted combination of the tensor nuclear norm, the l1-norm and the Frobenius norm under linear constraints. Then, the technique of Alternating Direction Method of Multipliers (ADMM) is employed to solve the proposed minimization problem. Next, we discuss the weak convergence of the proposed iterative algorithm. Finally, experimental results on synthetic and real-world datasets validate the efficiency and effectiveness of the proposed method.
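The abstract's description suggests an optimization model of the following shape (a plausible sketch assembled from the text; the weights λ and γ, the observation set Ω, and the exact notation are assumptions, not taken from the paper):

```latex
\min_{\mathcal{L},\,\mathcal{S},\,\mathcal{E}}
  \ \|\mathcal{L}\|_{*} \;+\; \lambda \,\|\mathcal{S}\|_{1}
  \;+\; \frac{\gamma}{2}\,\|\mathcal{E}\|_{F}^{2}
\quad \text{s.t.} \quad
P_{\Omega}\!\left(\mathcal{L} + \mathcal{S} + \mathcal{E}\right)
  = P_{\Omega}(\mathcal{T}),
```

where \(\mathcal{T}\) is the partially observed data tensor, \(\mathcal{L}\) the low-rank component, \(\mathcal{S}\) the gross sparse component, \(\mathcal{E}\) the small dense disturbance, and \(P_{\Omega}\) the projection onto the observed entries.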

Highlights

  • In the past decade, the low-rank structure of many datasets has been exploited to recover both the low-rank and the sparse components, or to complete the missing entries

  • The experiments are carried out on 50 × 50 × 50 synthetic tensors

  • We investigate the model of Generalized Low-Rank Tensor Recovery (GLRTR) and develop an algorithm to recover the low-rank tensor and the noise term


Summary

Introduction

The low-rank property of some datasets has been exploited to recover both the low-rank and the sparse components, or to complete the missing entries. Liu et al. [13] established a tensor nuclear norm minimization model for tensor completion and proposed the Alternating Direction Method of Multipliers (ADMM) for efficiently solving this model. Robust low-rank tensor recovery, named MRPCA [22], was described as a tensor nuclear norm minimization which can be solved efficiently by the ADMM. This paper studies a generalized model of LRTR. In this model, the investigated data tensor is assumed to be the superposition of a low-rank component, a gross sparse tensor and a small dense error tensor. Generalized Low-Rank Tensor Recovery (GLRTR) aims mainly to recover the low-rank and the sparse components from partially observed entries. For this purpose, we establish a tensor nuclear norm minimization model for GLRTR.
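Tensor nuclear norm methods of this kind typically apply matrix operations to the mode unfoldings of the tensor. As a minimal matrix-case sketch of the two proximal steps such an ADMM alternates between (singular value thresholding for the nuclear norm and soft-thresholding for the l1-norm), here is an illustrative low-rank plus sparse decomposition. The parameter choices (λ = 1/√max(m, n), an increasing penalty μ) follow common inexact-ALM heuristics and are assumptions, not taken from the paper:

```python
import numpy as np

def soft_threshold(X, tau):
    """Elementwise shrinkage: proximal operator of tau * ||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * ||.||_*."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * soft_threshold(s, tau)) @ Vt  # scale columns of U by shrunk s

def rpca_admm(D, lam=None, rho=1.05, iters=500, tol=1e-7):
    """Split matrix D into low-rank L and sparse S by ADMM.

    Solves  min ||L||_* + lam * ||S||_1  s.t.  L + S = D,
    using standard inexact-ALM parameter heuristics.
    """
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(D, 2)                  # initial penalty
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)                              # dual variable
    for _ in range(iters):
        L = svt(D - S + Y / mu, 1.0 / mu)             # low-rank update
        S = soft_threshold(D - L + Y / mu, lam / mu)  # sparse update
        R = D - L - S                                 # primal residual
        Y = Y + mu * R                                # dual ascent
        mu = min(mu * rho, 1e7)                       # grow the penalty
        if np.linalg.norm(R) <= tol * np.linalg.norm(D):
            break
    return L, S
```

In the tensor setting, the same two operators would be applied per mode unfolding, with an extra Frobenius-norm term absorbing the small dense disturbance.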

Notations and Preliminaries
Related Works
Generalized Model of GLRTR
Results on Synthetic Data
Applications in Background Modeling
Findings
Conclusions
