Abstract
The tensor train decomposition (TTD) has become an attractive decomposition approach: it is easy to infer using the singular value decomposition, and its flexible yet compact representation enables efficient computation and reduced memory usage in further analyses. Unfortunately, both the appropriate level of model complexity and the order in which the modes should be decomposed are unclear when using the TTD. We advance the TTD to a fully probabilistic TTD (PTTD), using variational Bayesian inference to account for parameter uncertainty and noise. In particular, we exploit that the PTTD enables model comparison via the evidence lower bound (ELBO) of the variational approximation. On synthetic data with known ground-truth structure and on a real 3-way fluorescence spectroscopy dataset, we demonstrate how the ELBO admits quantification of model specification, not only in terms of the number of components for each factor in the TTD, but also a suitable order of the modes in which the TTD should be applied. The proposed PTTD provides a principled framework for characterizing model uncertainty, complexity, and model- and mode-order when compressing tensor data using the TTD.
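The inference-by-SVD property mentioned above refers to the standard TT-SVD procedure: the tensor is repeatedly unfolded and factorized by truncated SVDs, one mode at a time, yielding a chain of three-way cores. A minimal numpy sketch is given below; the function name `tt_svd` and the single `max_rank` truncation cap are illustrative assumptions, not the paper's implementation, which instead infers the decomposition probabilistically.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a tensor into tensor-train cores via sequential truncated SVDs.

    Returns a list of cores; core k has shape (r_{k-1}, n_k, r_k),
    with boundary ranks r_0 = r_d = 1.
    """
    dims = tensor.shape
    cores, r_prev = [], 1
    mat = tensor.reshape(dims[0], -1)  # unfold along the first mode
    for k in range(len(dims) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))  # truncate to the requested TT-rank
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        # Carry the remainder S @ Vt forward and refold for the next mode.
        mat = (S[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

# Contracting the cores in sequence reconstructs the tensor
# (exactly, when max_rank is large enough that no truncation occurs).
X = np.random.default_rng(0).standard_normal((4, 5, 6))
cores = tt_svd(X, max_rank=30)
approx = cores[0]
for c in cores[1:]:
    approx = np.tensordot(approx, c, axes=([-1], [0]))
print(np.allclose(approx.squeeze(), X))
```

Note how the decomposition is inherently sequential: the first mode unfolded determines the first core, so different mode orderings give different TT-ranks and reconstruction errors, which is the model-order ambiguity the abstract's ELBO-based comparison addresses.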