Abstract

In this chapter we present numerical methods for low-rank matrix and tensor problems that explicitly make use of the geometry of rank-constrained matrix and tensor spaces. We focus on two types of problems: The first are optimization problems, such as matrix and tensor completion, linear systems, and eigenvalue problems. Such problems can be solved by numerical optimization on manifolds, known as Riemannian optimization methods. We explain the basic elements of differential geometry needed to apply such methods efficiently to rank-constrained matrix and tensor spaces. The second type consists of ordinary differential equations defined on matrix and tensor spaces. We show how their solutions can be approximated by the dynamical low-rank principle, and discuss several numerical integrators that rely in an essential way on geometric properties that are characteristic of sets of low-rank matrices and tensors.
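As a concrete illustration of the first problem class, the sketch below performs one Riemannian gradient step on the manifold of rank-r matrices: the Euclidean gradient is projected onto the tangent space at the current iterate, and the step is retracted back to the manifold by truncated SVD (the metric projection). This is a minimal sketch, not the chapter's implementation; the names `riemannian_gradient_step`, `G_eucl`, and `alpha` are illustrative assumptions, and a practical method would keep all quantities in factored low-rank form rather than forming dense matrices.

```python
import numpy as np

def truncated_svd(A, r):
    # Rank-r truncated SVD: by the Eckart-Young theorem this yields the
    # best rank-r approximation of A in the Frobenius and spectral norms.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r], s[:r], Vt[:r]

def riemannian_gradient_step(X, G_eucl, r, alpha):
    # One gradient step on the manifold of rank-r matrices:
    # 1) project the Euclidean gradient G_eucl onto the tangent space at X,
    # 2) step along the projected direction with step size alpha,
    # 3) retract to the manifold by truncated SVD (metric projection).
    U, _, Vt = truncated_svd(X, r)
    PU, PV = U @ U.T, Vt.T @ Vt
    xi = PU @ G_eucl + G_eucl @ PV - PU @ G_eucl @ PV  # tangent projection
    U1, s1, Vt1 = truncated_svd(X - alpha * xi, r)
    return U1 @ np.diag(s1) @ Vt1
```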

Highlights

  • The following chapter is an outline of Riemannian optimization and integration methods on manifolds of low-rank matrices and tensors

  • It may appear at this point that it is difficult to deal with the tensor train (TT) tensor format computationally, but this is not the case (a minimal TT-SVD sketch follows this list)

  • This is due to the structure (9.35) of tangent vectors as sums of TT decompositions that vary in a single core each [50]
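To illustrate why the TT format is computationally workable, the following sketch implements the sequential-SVD construction (TT-SVD) that decomposes a d-way array into TT cores. The function name `tt_svd` and the caller-supplied `ranks` list are assumptions of this sketch; the error bound that makes the truncation quasi-optimal is discussed in the chapter and omitted here.

```python
import numpy as np

def tt_svd(X, ranks):
    # Sequential-SVD construction of a tensor train: each step unfolds
    # the current remainder, truncates its SVD to the target rank, and
    # stores the left factor as a TT core of shape (r_{k-1}, n_k, r_k).
    dims = X.shape
    d = len(dims)
    cores, r_prev = [], 1
    C = X.reshape(dims[0], -1)
    for k in range(d - 1):
        C = C.reshape(r_prev * dims[k], -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(ranks[k], len(s))            # truncate to TT rank r_k
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        C = s[:r, None] * Vt[:r]             # carry the remainder onward
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores
```

Contracting the cores over the rank indices recovers a rank-(r_1, ..., r_{d-1}) approximation of X, which is what makes TT-SVD the tensor analogue of SVD truncation for matrices.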


Summary

Introduction

The following chapter is an outline of Riemannian optimization and integration methods on manifolds of low-rank matrices and tensors. One of the first works in which the geometry of non-symmetric fixed-rank matrices was quite explicitly exploited in numerical algorithms is [59]. It introduced the dynamical low-rank approximation method for calculating low-rank approximations when integrating a matrix that satisfies a set of ordinary differential equations (ODEs), as we explain in a later section. For optimization problems with rank constraints, several Riemannian optimization methods were first presented in [79, 98, 113], each using slightly different geometries of the sets of fixed-rank matrices. Some examples and references for successful applications of such methods are presented in some detail later.
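To make the dynamical low-rank idea concrete, here is a minimal sketch of a first-order (Lie-Trotter) projector-splitting step for a matrix ODE dY/dt = F(Y), keeping Y in factored form Y = U S Vᵀ throughout. The explicit Euler sub-steps and the name `projector_splitting_step` are simplifying assumptions of this illustration; in practice the sub-problems are integrated more accurately.

```python
import numpy as np

def projector_splitting_step(U, S, V, F, h):
    # One first-order projector-splitting step for dY/dt = F(Y),
    # with Y = U @ S @ V.T, U of size (m, r), S (r, r), V (n, r).
    # K-step: evolve K = U S with V frozen.
    K = U @ S + h * F(U @ S @ V.T) @ V
    U1, S_hat = np.linalg.qr(K)
    # S-step: evolve S *backwards in time* (note the minus sign).
    S_tilde = S_hat - h * U1.T @ F(U1 @ S_hat @ V.T) @ V
    # L-step: evolve L = V S^T with U1 frozen.
    L = V @ S_tilde.T + h * F(U1 @ S_tilde @ V.T).T @ U1
    V1, S1_t = np.linalg.qr(L)
    return U1, S1_t.T, V1
```

A notable feature of this scheme is the backward S-step, which is what makes the splitting exact on the manifold of rank-r matrices and robust to small singular values.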

Aims and Outline
The Geometry of Low-Rank Matrices
Singular Value Decomposition and Low-Rank Approximation
Fixed Rank Manifold
Tangent Space
Retraction
The Geometry of the Low-Rank Tensor Train Decomposition
The Tensor Train Decomposition
TT-SVD and Quasi Optimal Rank Truncation
Manifold Structure
Tangent Space and Retraction
Elementary Operations and TT Matrix Format
Optimization Problems
Riemannian Optimization
Linear Systems
Computational Cost
Difference to Iterative Thresholding Methods
Convergence
Eigenvalue Problems
Initial Value Problems
Dynamical Low-Rank Approximation
Approximation Properties
Low-Dimensional Evolution Equations
Projector-Splitting Integrator
Applications
Matrix Equations
Schrödinger Equation
Matrix and Tensor Completion
Stochastic and Parametric Equations
Transport Equations
Conclusions

