Abstract

Higher-order low-rank tensors naturally arise in many applications, including hyperspectral data recovery, video inpainting, and seismic data reconstruction. We propose a new model to recover a low-rank tensor by simultaneously performing low-rank matrix factorizations to the all-mode matricizations of the underlying tensor. An alternating minimization algorithm is applied to solve the model, along with two adaptive rank-adjusting strategies for when the exact rank is not known. Phase transition plots reveal that our algorithm can recover a variety of synthetic low-rank tensors from significantly fewer samples than the compared methods, which include a matrix completion method applied to tensor recovery and two state-of-the-art tensor completion methods. Further tests on real-world data show similar advantages. Although our model is non-convex, our algorithm performs consistently throughout the tests and gives better results than the compared methods, some of which are based on convex models. In addition, the global convergence of our algorithm can be established in the sense that the gradient of the Lagrangian function converges to zero.

Highlights

  • A tensor is a generalization of vectors and matrices

  • We apply a low-rank matrix factorization to each mode unfolding of the tensor to enforce low-rankness, and update the matrix factors alternately, which is computationally much cheaper than singular value decomposition (SVD)

  • Non-convexity makes it difficult to predict the performance of our approach theoretically, and in general the performance can vary with the choice of algorithm and starting point
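The factorization-based update described in the second highlight can be illustrated with a minimal NumPy sketch. This is not the authors' TMac implementation; it assumes a toy 3-way tensor of known multilinear rank and uses a plain alternating least-squares pass (in the spirit of LMaFit-style updates) on each mode unfolding, with no SVD anywhere:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def als_update(Z, X, Y):
    """One alternating least-squares pass for the factorization Z ~ X @ Y."""
    X = Z @ np.linalg.pinv(Y)   # fix Y, solve for X in least squares
    Y = np.linalg.pinv(X) @ Z   # fix X, solve for Y in least squares
    return X, Y

# Toy 3-way tensor with multilinear rank (2, 2, 2), built from a Tucker form.
rng = np.random.default_rng(0)
core = rng.standard_normal((2, 2, 2))
T = np.einsum('abc,ia,jb,kc->ijk',
              core,
              rng.standard_normal((5, 2)),
              rng.standard_normal((6, 2)),
              rng.standard_normal((7, 2)))

# Factor every mode unfolding; only cheap least-squares solves are used.
rel_errs = []
for n, r in enumerate((2, 2, 2)):
    Z = unfold(T, n)
    X = rng.standard_normal((Z.shape[0], r))
    Y = rng.standard_normal((r, Z.shape[1]))
    for _ in range(50):
        X, Y = als_update(Z, X, Y)
    rel_errs.append(np.linalg.norm(Z - X @ Y) / np.linalg.norm(Z))
print(rel_errs)
```

Because each unfolding of this fully observed rank-(2, 2, 2) tensor has matrix rank 2, the alternating pass recovers every unfolding essentially exactly; the completion setting in the paper additionally restricts the fit to observed entries.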



Introduction

A tensor is a generalization of vectors and matrices: a vector is a first-order (also called one-way or one-mode) tensor, and a matrix is a second-order tensor.

As a matrix completion baseline for recovering low-rank tensors, we unfold the underlying N-way tensor M along its Nth mode and apply LMaFit [27] to (3), where Z corresponds to unfold_N(M). The "square" model (7) solved by LMaFit gives much better results, but still worse than those given by TMac. In addition, Figure 3 depicts the phase transition plots of TMac utilizing 1, 2, 3, and 4 modes of matricization on the 4-way dataset.

In the rank-increasing scheme, we update Y_n^{k+1} to Q and set X_n^{k+1} ← [X_n^{k+1}, 0], where 0 is an I_n × Δr_n zero matrix. This scheme works well both for exactly low-rank tensors and for approximately low-rank ones.
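The rank-increasing step quoted above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the excerpt does not say how Q is formed, so here the Δr_n appended rows are obtained by orthonormalizing a random Gaussian matrix. The key property is that the appended zero columns of X annihilate the new rows of Y, so the current iterate X @ Y (and hence the objective value) is unchanged while later alternating steps can exploit the extra rank:

```python
import numpy as np

def increase_rank(X, Y, delta_r, rng):
    """Grow the rank of the factorization X @ Y by delta_r.

    Appends delta_r zero columns to X and delta_r orthonormal rows Q
    to Y; the zero columns multiply the new rows, so the product
    X @ Y is exactly preserved.
    """
    X_aug = np.hstack([X, np.zeros((X.shape[0], delta_r))])   # [X, 0]
    Q, _ = np.linalg.qr(rng.standard_normal((Y.shape[1], delta_r)))
    Y_aug = np.vstack([Y, Q.T])                               # append rows of Q
    return X_aug, Y_aug

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 2))   # I_n = 5, current rank r_n = 2
Y = rng.standard_normal((2, 8))
X2, Y2 = increase_rank(X, Y, 1, rng)   # Δr_n = 1
print(X2.shape, Y2.shape)
```

A quick check that the iterate is preserved: `np.allclose(X @ Y, X2 @ Y2)` holds, which is why this adjustment can be applied at any iteration without increasing the objective.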

[Figure: phase transition plots of TMac with fixed parameters; legend: Fail / Succeed]