Abstract

We propose a method for the approximation of tensors, given either explicitly or implicitly as the solution of tensor linear systems, in the Tucker tensor format. It is an iterative method that greedily constructs, by means of successive rank one approximations, a suitable tensor product subspace in which to approximate the tensor. An approximation to the target tensor is then obtained by solving a suitable projection of the linear system into the constructed tensor subspace. The proposed method has applications in tensor-structured discretization methods for partial differential equations, such as the Finite Difference Method and Isogeometric Analysis. In several numerical experiments, we compare the method to the greedy rank one update (or Proper Generalized Decomposition) approach, as well as to quasi-optimal approximations obtained by a truncated higher-order singular value decomposition. The proposed method outperforms the rank one update method significantly, both in terms of error per iteration and error per unit of computation time.
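The sketch below illustrates, in broad strokes, the idea described in the abstract for the explicitly given case: factor bases are grown greedily from successive rank one approximations of the residual, and the tensor is then projected onto the resulting tensor product subspace to obtain a Tucker approximation. This is a minimal NumPy sketch under our own assumptions, not the paper's implementation; the function names (`rank_one_approx`, `greedy_tucker_approx`), the ALS-style rank one step, and the restriction to 3-way tensors are illustrative choices, and the implicit case (tensors given as solutions of tensor linear systems) is not covered.

```python
import numpy as np

def rank_one_approx(T, iters=25):
    """ALS-style rank one approximation of a 3-way tensor T (illustrative)."""
    u, v, w = (np.random.rand(n) for n in T.shape)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    return u, v, w

def greedy_tucker_approx(T, rank):
    """Greedily grow Tucker factor bases from rank one approximations of the
    residual, then project T onto the spanned tensor product subspace."""
    U = np.zeros((T.shape[0], 0))
    V = np.zeros((T.shape[1], 0))
    W = np.zeros((T.shape[2], 0))
    approx = np.zeros_like(T)
    for _ in range(rank):
        # Rank one step on the current residual enlarges each factor basis.
        u, v, w = rank_one_approx(T - approx)
        U = np.linalg.qr(np.column_stack([U, u]))[0]
        V = np.linalg.qr(np.column_stack([V, v]))[0]
        W = np.linalg.qr(np.column_stack([W, w]))[0]
        # Orthogonal projection onto the tensor product subspace: core tensor.
        core = np.einsum('ijk,ia,jb,kc->abc', T, U, V, W)
        approx = np.einsum('abc,ia,jb,kc->ijk', core, U, V, W)
    return core, (U, V, W)

if __name__ == "__main__":
    # Toy usage: approximate a random low-rank tensor plus noise.
    rng = np.random.default_rng(0)
    T = np.einsum('ir,jr,kr->ijk', rng.standard_normal((20, 3)),
                  rng.standard_normal((22, 3)), rng.standard_normal((24, 3)))
    T += 1e-3 * rng.standard_normal(T.shape)
    core, (U, V, W) = greedy_tucker_approx(T, rank=3)
    approx = np.einsum('abc,ia,jb,kc->ijk', core, U, V, W)
    print("relative error:", np.linalg.norm(T - approx) / np.linalg.norm(T))
```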
