Abstract

An optimization-based approach for Tucker tensor approximation of parameter-dependent data tensors and of solutions of tensor differential equations with low Tucker rank is presented. The problem of updating the tensor decomposition is reformulated as a fitting problem subject to the tangent space, without relying on an orthogonality gauge condition. A discrete Euler scheme is established in an alternating least squares framework, where the quadratic subproblems reduce to trace optimization problems that are explicitly solvable via SVDs of small size. In the presence of small singular values, instability for larger ranks is reduced, since the method does not require the (pseudo)inverse of matricizations of the core tensor. Regularization of Tikhonov type can be used to compensate for the lack of uniqueness in the tangent space. The method is validated numerically and shown to be stable also for larger ranks in the case of small singular values of the core unfoldings. Higher order explicit integrators of Runge-Kutta type can be composed.
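
As a minimal, generic illustration of the kind of subproblem the abstract alludes to (not the paper's actual update formulas), consider maximizing $\mathrm{trace}(U^T M)$ over matrices $U$ with orthonormal columns: if $M = P \Sigma Q^T$ is a thin SVD, then $U = P Q^T$ is a maximizer and the optimal value is the sum of the singular values of $M$. A NumPy sketch, with the function name chosen here for illustration:

    import numpy as np

    def trace_maximizer(M):
        # Return U with orthonormal columns maximizing trace(U.T @ M).
        # Classical Procrustes-type solution: U = P @ Q.T, where
        # M = P @ diag(s) @ Q.T is a thin SVD of M.
        P, s, Qt = np.linalg.svd(M, full_matrices=False)
        return P @ Qt

    # Toy check: the maximum equals the sum of the singular values of M.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 5))           # tall matrix, small second dimension
    U = trace_maximizer(M)
    assert np.allclose(U.T @ U, np.eye(5))     # orthonormal columns
    assert np.isclose(np.trace(U.T @ M), np.linalg.svd(M, compute_uv=False).sum())

The cost of the thin SVD is governed by the smaller dimension, so such subproblems remain cheap even when the other dimension is large.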

Highlights

  • Low-rank matrix and tensor approximation occurs in a variety of applications as a model reduction approach [7]

  • We have not observed any related problems in the numerical tests; still, we found that including a regularization of Tikhonov type helps to stabilize the singular value decomposition (SVD) in cases where $B_n$ is effectively rank deficient and where $C_{(n)}C_{(n)}^T$ is numerically close to singular, respectively (a generic sketch follows these highlights)

  • Alg. 1 was used with a tolerance of $10^{-5}$ for the change in the relative defect $\|\dot{A}(t)-\dot{Y}(t)\|/\|\dot{A}(t)\|$ (cf. (36)), which was typically reached after two loops
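
The paper's own regularized equations are not reproduced here; the following is only a generic sketch of a Tikhonov-regularized least squares solve via SVD, with the function and parameter names chosen for illustration:

    import numpy as np

    def tikhonov_solve(B, rhs, lam=1e-8):
        # Solve min_X ||B @ X - rhs||_F^2 + lam * ||X||_F^2 via an SVD of B.
        # The damped factors s / (s**2 + lam) replace 1/s, so nearly zero
        # singular values of B no longer blow up the solution.
        U, s, Vt = np.linalg.svd(B, full_matrices=False)
        filt = s / (s**2 + lam)
        return Vt.T @ (filt[:, None] * (U.T @ rhs))

For lam -> 0 this recovers the pseudoinverse solution; a small positive lam trades a controlled bias for stability when B is numerically close to singular.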


Summary

Introduction

Low-rank matrix and tensor approximation occurs in a variety of applications as a model reduction approach [7]. In a discrete-time setting for tensor streams, incremental fitting procedures for dimensionality reduction were introduced as dynamic higher order generalizations of principal component analysis [25]. While this approach seeks to minimize a reconstruction error by a sequence of Tucker tensor approximations sharing the same factors and is discrete in time, a different, time-continuous setting was recently discussed by Koch and Lubich [12, 13], where differential equations for the factor matrices and the core tensor of the Tucker approximation were derived. Problem (2) yields an initial value problem on $\mathcal{M}_r$ in the form of a system of nonlinear differential equations for the factors and the core tensor of the Tucker decomposition. This system is explicit if an orthogonality gauge condition in $T_Y\mathcal{M}_r$ is imposed. We validate our approach by means of numerical examples.
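
To fix the format that the dynamical approximation evolves in time, a rank-$(r_1,\dots,r_d)$ Tucker decomposition represents a tensor as a small core tensor multiplied by one factor matrix per mode. The sketch below computes such an approximation statically via the truncated higher-order SVD; this is not the dynamical scheme of the paper, only a minimal NumPy illustration of the core-plus-factors format (function names are illustrative):

    import numpy as np

    def unfold(T, n):
        # Mode-n matricization: mode n becomes the row index.
        return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

    def mode_product(T, M, n):
        # Multiply tensor T by matrix M along mode n.
        return np.moveaxis(np.tensordot(M, np.moveaxis(T, n, 0), axes=1), 0, n)

    def hosvd(A, ranks):
        # Truncated HOSVD: factor U_n from the leading left singular vectors
        # of the mode-n unfolding, core C = A x_1 U_1^T x_2 U_2^T ...
        factors = [np.linalg.svd(unfold(A, n), full_matrices=False)[0][:, :r]
                   for n, r in enumerate(ranks)]
        C = A
        for n, U in enumerate(factors):
            C = mode_product(C, U.T, n)
        return C, factors

    # Reconstruction Y ~ A from the core and factors.
    A = np.random.default_rng(1).standard_normal((20, 18, 16))
    C, Us = hosvd(A, (5, 5, 5))
    Y = C
    for n, U in enumerate(Us):
        Y = mode_product(Y, U, n)

The dynamical approach discussed in the paper keeps this core-plus-factors format but evolves $C$ and the $U_n$ in time rather than recomputing a full decomposition at every step.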

Prerequisites
Notation
Related approach of Koch and Lubich
Dynamical approximation without orthogonality constraint in the tangent space
Euler method under orthogonal columns constraint
Algorithm for Euler step via ALS
Regularization
Numerical validation
Second Order Method
Small singular values in higher dimensions
Spatially discretized PDEs
Conclusion