Abstract

We study extensions of compressive sensing and low rank matrix recovery to the recovery of tensors of low rank from incomplete linear information. While the reconstruction of low rank matrices via nuclear norm minimization is rather well understood by now, almost no theory is available so far for the extension to higher order tensors due to various theoretical and computational difficulties arising for tensor decompositions. In fact, nuclear norm minimization for matrix recovery is a tractable convex relaxation approach, but the extension of the nuclear norm to tensors is in general NP-hard to compute. In this article, we introduce convex relaxations of the tensor nuclear norm which are computable in polynomial time via semidefinite programming. Our approach is based on theta bodies, a concept from real computational algebraic geometry which is similar to that of the better known Lasserre relaxations. We introduce polynomial ideals which are generated by the second-order minors corresponding to different matricizations of the tensor (where the tensor entries are treated as variables) such that the nuclear norm ball is the convex hull of the algebraic variety of the ideal. The theta body of order k for such an ideal generates a new norm which we call the θk-norm. We show that in the matrix case, these norms reduce to the standard nuclear norm. For tensors of order three or higher, however, we indeed obtain new norms. The sequence of the corresponding unit θk-norm balls converges asymptotically to the unit tensor nuclear norm ball. By providing the Gröbner basis for the ideals, we explicitly give semidefinite programs for the computation of the θk-norm and for the minimization of the θk-norm under an affine constraint. Finally, numerical experiments for third-order tensor recovery via θ1-norm minimization suggest that our approach successfully reconstructs tensors of low rank from incomplete linear (random) measurements.
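The variety underlying these ideals can be checked numerically for a rank-one third-order tensor: each matricization (unfolding) has rank one, so all of its second-order (2×2) minors vanish. The following numpy sketch is illustrative only (not the authors' code; the helper names unfold and max_abs_2x2_minor are ours).

# Illustrative sketch: all 2x2 minors of every matricization of a
# rank-one third-order tensor vanish (up to round-off).
import numpy as np
from itertools import combinations

def unfold(T, mode):
    # Mode-n matricization: mode-n fibers arranged as columns.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def max_abs_2x2_minor(M):
    # Largest absolute value of a 2x2 minor of the matrix M.
    best = 0.0
    for r in combinations(range(M.shape[0]), 2):
        for c in combinations(range(M.shape[1]), 2):
            best = max(best, abs(np.linalg.det(M[np.ix_(r, c)])))
    return best

# Rank-one tensor T = a (x) b (x) c, normalized to unit Frobenius norm.
a, b, c = np.random.randn(3), np.random.randn(4), np.random.randn(5)
T = np.einsum('i,j,k->ijk', a, b, c)
T /= np.linalg.norm(T)

for mode in range(3):
    print(mode, max_abs_2x2_minor(unfold(T, mode)))  # essentially zero for all modes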

Highlights

  • Compressive sensing predicts that sparse vectors can be recovered from underdetermined linear measurements via efficient methods such as ℓ1-minimization [10, 20, 23]

  • We study extensions of compressive sensing and low rank matrix recovery to the recovery of tensors of low rank from incomplete linear information

  • We introduce convex relaxations of the tensor nuclear norm which are computable in polynomial time via semidefinite programming

Summary

Introduction and motivation

Compressive sensing predicts that sparse vectors can be recovered from underdetermined linear measurements via efficient methods such as ℓ1-minimization [10, 20, 23]. Optimal estimates of the required number of measurements are presently available only for tensor recovery approaches that are NP-hard. Such a theoretical analysis is still missing for θk-norm minimization, but will be the subject of future work. The computation of the theta basis in turn needs a reduced Gröbner basis of the polynomial ideal whose real algebraic variety corresponds to the (canonical) rank-one, unit norm tensors. Since the theta norms are built from the polynomial ideal whose real algebraic variety contains all rank-one unit norm tensors, it is a natural question to ask whether the resulting θk-norms coincide with the weighted sum of the nuclear norms of the matricizations. One cannot transfer theoretical results for tensor nuclear norm minimization to θk-norm minimization; rather, a direct analysis of our approach is required, which is postponed to future contributions.
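For the compressive sensing setting mentioned above, the following cvxpy sketch is illustrative only (the problem sizes n, m, s are arbitrary choices of ours): it recovers a sparse vector from Gaussian measurements via ℓ1-minimization (basis pursuit).

# Illustrative basis pursuit sketch: minimize the l1-norm subject to the
# affine measurement constraint A x = y.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, s = 128, 50, 5                          # dimension, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian measurement map
y = A @ x_true

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])
prob.solve()
print("relative error:", np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))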

Low rank matrix recovery
Tensor recovery
Some notation
Structure of the paper
Theta bodies
The matrix case
The tensor θk-norm
Third-order tensors
The theta norm for general d th-order tensors
Convergence of the unit θk-norm balls
Computational complexity
Numerical experiments
The S-polynomial of f and g is the combination S(f, g) = (x^γ/LT(f))·f − (x^γ/LT(g))·g, where x^γ = lcm(LM(f), LM(g)) is the least common multiple of the leading monomials of f and g.
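As a small illustration not taken from the paper, the following sympy snippet computes the S-polynomial of the textbook pair f = x³ − 2xy and g = x²y − 2y² + x with respect to the graded lexicographic order, together with a Gröbner basis of the ideal they generate.

# Illustrative sketch: the S-polynomial cancels the leading terms of f and g;
# reducing S-polynomials is the core step of Buchberger's algorithm for
# computing a (reduced) Groebner basis.
from sympy import symbols, expand, groebner

x, y = symbols('x y')
f = x**3 - 2*x*y           # LT(f) = x^3 in grlex order
g = x**2*y - 2*y**2 + x    # LT(g) = x^2*y
# lcm of the leading monomials is x^3*y, so
S = expand(y*f - x*g)      # S(f, g) = -x**2
print(S)
print(groebner([f, g], x, y, order='grlex'))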