Abstract

Tensors, or multi-indexed arrays, play an important role in machine learning and signal processing. These higher-order generalizations of matrices allow for preservation of higher-order structure in data, and low rank decompositions of tensors allow for recovery of underlying information. In many cases, the underlying tensor of interest is positive (semi)definite, i.e., the homogeneous polynomial associated to the tensor has nonnegative evaluation on all inputs. An archetypal problem is that one has a noisy measurement of some low rank signal tensor of interest. This measurement itself does not have low rank, so one must compute a best low rank approximation of the measured tensor to recover component information. As it turns out, the set of tensors of rank less than or equal to r is, in general, not closed when r ≥ 2, and, as a consequence, best low rank approximations can fail to exist. In the case when a best low rank approximation does not exist, near optimal low rank approximations suffer from numerical issues and cannot be used to reliably approximate underlying component information. As a consequence, existence guarantees for best low rank tensor approximations are of great practical and theoretical interest. This article develops deterministic guarantees for the existence of best low rank approximations of tensors which are positive semidefinite. In particular, we show that the set of low rank positive semidefinite tensors is relatively closed as a subset of the set of positive semidefinite tensors. We use this fact to give a deterministic bound which may be used to guarantee the existence of a best low rank approximation of a noisy low rank positive semidefinite tensor. Furthermore, we show that this condition guarantees uniqueness of the canonical polyadic decomposition for best low rank approximations. In addition, for order three tensors, we prove that our bound is sharp and that it can be computed using semidefinite programming.
The main results of this article are illustrated through numerical experiments, which show that our bound is highly predictive of numerical issues when attempting to recover underlying component information from a noisy low rank positive semidefinite tensor.
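The non-closedness phenomenon driving this work can be seen numerically. The sketch below uses the classic example of de Silva and Lim (an illustration chosen here, not an example taken from this article): the order-three tensor T = a⊗a⊗b + a⊗b⊗a + b⊗a⊗a has rank 3 but is a limit of rank-2 tensors, so a best rank-2 approximation of T does not exist, and near-optimal rank-2 approximants have components whose norms blow up.

```python
import numpy as np

def outer3(x, y, z):
    # Order-three outer (tensor) product of three vectors.
    return np.einsum('i,j,k->ijk', x, y, z)

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

# Rank-3 target tensor with border rank 2 (de Silva-Lim example).
T = outer3(a, a, b) + outer3(a, b, a) + outer3(b, a, a)

for n in [1, 10, 100, 1000]:
    c = a + b / n
    # Rank-2 approximant: n*(a + b/n)^{\otimes 3} - n*a^{\otimes 3}.
    Tn = n * outer3(c, c, c) - n * outer3(a, a, a)
    err = np.linalg.norm(Tn - T)
    # The error shrinks like 1/n while the component weights grow like n,
    # which is exactly the numerical instability the abstract describes.
    print(f"n = {n:5d}   approximation error = {err:.6f}")
```

The approximation error tends to zero, yet no rank-2 tensor attains it; this is the failure mode that the article's positive semidefiniteness condition is designed to rule out.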
