Abstract
Large-scale multidimensional data are often available as multiway arrays, or higher-order tensors, which can be approximately represented in distributed form via low-rank tensor decompositions and tensor networks. Our particular emphasis is on elucidating that, by virtue of the underlying low-rank approximations, tensor networks can reduce dimensionality and alleviate the curse of dimensionality in a number of applied areas, especially in large-scale optimization problems and deep learning. We briefly review, and provide links between, low-rank tensor network decompositions and deep neural networks. We elucidate, through graphical illustrations, that by means of low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks can perform distributed computations on otherwise prohibitively large volumes of data/parameters. Our focus is on the Hierarchical Tucker and tensor train (TT) decompositions, and on MERA tensor networks, in specific applications.
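As a concrete illustration of the tensor train (TT) decomposition mentioned above, the following is a minimal sketch of the standard TT-SVD procedure in NumPy, which splits a d-way array into d third-order cores via sequential truncated SVDs. The function name and the simple rank cap are our own choices for illustration; a production implementation would typically also truncate by a singular-value tolerance to meet a prescribed approximation error.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way array into d TT cores of shape (r_{k-1}, n_k, r_k)
    via sequential SVDs, capping each TT rank at max_rank (illustrative sketch)."""
    shape = tensor.shape
    d = tensor.ndim
    cores = []
    rank = 1
    # Unfold the tensor so the first mode (times the incoming rank) indexes rows.
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, s.size)  # truncate to the rank cap
        cores.append(U[:, :r_new].reshape(rank, shape[k], r_new))
        mat = s[:r_new, None] * Vt[:r_new]  # carry the remainder forward
        rank = r_new
        if k + 1 < d - 1:
            mat = mat.reshape(rank * shape[k + 1], -1)
    cores.append(mat.reshape(rank, shape[-1], 1))  # final core
    return cores
```

Contracting the cores in sequence (e.g. with `np.tensordot`) reconstructs an approximation of the original tensor; with `max_rank` at least as large as the exact TT ranks, the reconstruction is exact up to floating-point error, which is how the low-rank, distributed representation replaces the full array.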