Abstract
We propose a method for approximating tensors, given either explicitly or implicitly as the solution of tensor linear systems, in the Tucker tensor format. It is an iterative method that greedily constructs, by means of successive rank one approximations, a suitable tensor product subspace in which to approximate the tensor. An approximation to the target tensor is then obtained by solving a suitable projection of the linear system into the constructed tensor subspace. The proposed method has applications in tensor-structured discretization methods for partial differential equations, such as the Finite Difference Method and Isogeometric Analysis. In several numerical experiments, we compare the method to the greedy rank one update (or Proper Generalized Decomposition) approach, as well as to quasi-optimal approximations obtained by a truncated higher-order singular value decomposition. The proposed method outperforms the rank one update method significantly, both in terms of error per iteration and error per unit of computation time.
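The following is a minimal sketch, not the authors' implementation, of the greedy idea described in the abstract for the simplest case of an explicitly given third-order tensor: in each iteration a rank one approximation of the current residual is computed (here by plain alternating least squares, which is an assumption; the paper may use a different rank one solver), its factor vectors enlarge orthonormal basis matrices, and the tensor is re-approximated by orthogonal projection onto the spanned tensor product subspace, yielding a Tucker approximation. The implicit case would instead solve a projected tensor linear system at this step.

```python
import numpy as np

def rank_one_als(R, iters=20):
    """Approximate the 3D residual R by an outer product a (x) b (x) c via ALS."""
    n1, n2, n3 = R.shape
    b, c = np.random.randn(n2), np.random.randn(n3)
    b /= np.linalg.norm(b); c /= np.linalg.norm(c)
    for _ in range(iters):
        a = np.einsum('ijk,j,k->i', R, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', R, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', R, a, b)  # c keeps the magnitude
    return a, b, c

def greedy_tucker(A, rank, als_iters=20):
    """Greedily build Tucker bases U1, U2, U3 and core G with A ~= G x1 U1 x2 U2 x3 U3."""
    U = [np.zeros((n, 0)) for n in A.shape]
    approx = np.zeros_like(A)
    for _ in range(rank):
        # rank one approximation of the current residual
        a, b, c = rank_one_als(A - approx, als_iters)
        # enlarge each basis by the new factor vector, re-orthonormalized
        for k, v in enumerate((a, b, c)):
            w = v - U[k] @ (U[k].T @ v)
            if np.linalg.norm(w) > 1e-12:
                U[k] = np.hstack([U[k], (w / np.linalg.norm(w))[:, None]])
        # orthogonal projection of A onto the tensor product subspace: core tensor G
        G = np.einsum('ijk,ia,jb,kc->abc', A, U[0], U[1], U[2])
        approx = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
    return U, G
```

A greedy rank one update (PGD-style) method would instead add the rank one term itself to the approximation; re-projecting onto the enlarged subspace, as above, is what distinguishes the Tucker-subspace approach compared in the experiments.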