Abstract
Recent achievements in the field of tensor product approximation provide promising new formats for the representation of tensors in the form of tree tensor networks. In contrast to the canonical $r$-term representation (CANDECOMP, PARAFAC), these new formats provide stable representations, while the amount of required data is only slightly larger. The tensor train (TT) format [SIAM J. Sci. Comput., 33 (2011), pp. 2295-2317], a simple special case of the hierarchical Tucker format [J. Fourier Anal. Appl., 15 (2009), pp. 706-722], is a useful prototype for practical low-rank tensor representation. In this article, we show how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamic rank adaptation. A formulation of the component equations in terms of so-called retraction operators helps to show that many structural properties of the original problems transfer to the micro-iterations, giving what is, to our knowledge, the first stable generic algorithm for the treatment of optimization tasks in the tensor format. For the examples of linear equations and eigenvalue equations, we derive concrete working equations for the micro-iteration steps; numerical examples confirm the theoretical results concerning the stability of the TT decomposition and of ALS and MALS, but also show that in some cases high TT ranks are required during the iterative approximation of low-rank tensors, indicating some potential for improvement.
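To make the alternating scheme concrete, the following is a minimal sketch of ALS sweeps in the TT format for the simplest optimization task of this kind: fitting a given order-3 tensor by a tensor train with prescribed TT ranks. The function name `tt_als_fit` and the core names `G1`, `G2`, `G3` are illustrative choices, not notation from the paper; the paper's ALS/MALS algorithms for linear and eigenvalue equations follow the same pattern, with each micro-iteration reduced to a small subproblem for one core while the remaining cores are held fixed.

```python
import numpy as np

def tt_als_fit(X, r1, r2, sweeps=10, seed=0):
    """Approximate X[i,j,k] ~= sum_{a,b} G1[i,a] G2[a,j,b] G3[b,k] by ALS.

    Illustrative sketch only: each half-step fixes two cores and solves an
    ordinary least-squares problem (the "micro-iteration") for the third.
    """
    n1, n2, n3 = X.shape
    rng = np.random.default_rng(seed)
    G1 = rng.standard_normal((n1, r1))
    G2 = rng.standard_normal((r1, n2, r2))
    G3 = rng.standard_normal((r2, n3))

    for _ in range(sweeps):
        # Micro-iteration for G1: least squares against the contraction of
        # the fixed cores G2 and G3.
        M = np.einsum('ajb,bk->ajk', G2, G3).reshape(r1, n2 * n3)
        G1 = np.linalg.lstsq(M.T, X.reshape(n1, -1).T, rcond=None)[0].T

        # Micro-iteration for G2: the fixed cores enter via a Kronecker product.
        K = np.kron(G1, G3.T)                           # (n1*n3) x (r1*r2)
        X2 = X.transpose(0, 2, 1).reshape(n1 * n3, n2)  # rows indexed by (i, k)
        G2 = (np.linalg.lstsq(K, X2, rcond=None)[0]
              .reshape(r1, r2, n2).transpose(0, 2, 1))

        # Micro-iteration for G3.
        N = np.einsum('ia,ajb->ijb', G1, G2).reshape(n1 * n2, r2)
        G3 = np.linalg.lstsq(N, X.reshape(n1 * n2, n3), rcond=None)[0]

    approx = np.einsum('ia,ajb,bk->ijk', G1, G2, G3)
    return (G1, G2, G3), np.linalg.norm(X - approx) / np.linalg.norm(X)


# Usage: a random tensor of exact TT ranks (2, 2) should be recovered almost
# exactly after a few sweeps; for general tensors, ALS only finds a locally
# optimal low-rank approximation.
rng = np.random.default_rng(1)
cores = (rng.standard_normal((6, 2)),
         rng.standard_normal((2, 7, 2)),
         rng.standard_normal((2, 8)))
X = np.einsum('ia,ajb,bk->ijk', *cores)
_, rel_err = tt_als_fit(X, r1=2, r2=2, sweeps=20)
print(f"relative error after ALS sweeps: {rel_err:.2e}")
```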