Abstract

Tensor networks (TNs) have demonstrated remarkable efficacy in the compact representation of high-order data. In contrast to TN methods with pre-determined structures, the recently introduced tensor network structure search (TNSS) methods automatically learn a compact TN structure from the data and have gained increasing attention. Nonetheless, TNSS requires time-consuming manual adjustment of the penalty parameters that control model complexity to achieve good performance, especially in the presence of missing or noisy data. To address this problem, in this paper we propose a tuning-free TNSS algorithm based on Bayesian modeling, which conducts TNSS in a fully data-driven manner. Specifically, the uncertainty arising from data corruption is incorporated into the prior of the probabilistic model. We reframe TN structure determination as a rank learning problem for the fully-connected tensor network (FCTN), integrating the generalized inverse Gaussian (GIG) distribution to promote low-rankness. To eliminate the need for hyperparameter tuning, we adopt a fully Bayesian approach and propose an efficient Markov chain Monte Carlo (MCMC) algorithm for sampling from the posterior distribution. Experimental results demonstrate that, compared with previous TNSS methods, the proposed algorithm effectively and efficiently finds the latent TN structures of the data under various missing-data and noise conditions and achieves the best recovery results. Furthermore, our method outperforms other state-of-the-art tensor-decomposition-based completion methods on tensor completion with real-world data.
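
The abstract gives no implementation details, but the core mechanism it describes, a GIG prior that shrinks superfluous rank components so the TN structure can be read off from the posterior, follows a standard conjugate pattern. The sketch below illustrates that pattern on a toy two-factor model standing in for the FCTN cores; the function names, the GIG(p, a, b) parametrization, and the hyperparameters p0, a0, b0 are assumptions for illustration, not the authors' code.

```python
import numpy as np
from scipy.stats import geninvgauss

def sample_gig(p, a, b, rng):
    # Draw from GIG(p, a, b) with density proportional to
    # x^(p-1) * exp(-(a*x + b/x)/2), mapped onto SciPy's
    # geninvgauss parametrization via scale = sqrt(b/a), b' = sqrt(a*b).
    return geninvgauss.rvs(p, np.sqrt(a * b), scale=np.sqrt(b / a),
                           random_state=rng)

def gibbs_rank_variances(U, V, p0=-0.5, a0=1.0, b0=1e-6, rng=None):
    # One Gibbs step for the per-component variances lambda_r in a toy
    # low-rank model X ~ U @ V.T with u_r ~ N(0, lambda_r * I),
    # v_r ~ N(0, lambda_r * I), and lambda_r ~ GIG(p0, a0, b0).
    # The GIG prior is conjugate to the Gaussian variance, so the
    # conditional is again GIG:
    #   lambda_r | U, V ~ GIG(p0 - (m + n)/2, a0, b0 + ||u_r||^2 + ||v_r||^2)
    rng = rng or np.random.default_rng()
    m, n = U.shape[0], V.shape[0]
    lam = np.empty(U.shape[1])
    for r in range(U.shape[1]):
        sq = U[:, r] @ U[:, r] + V[:, r] @ V[:, r]
        lam[r] = sample_gig(p0 - 0.5 * (m + n), a0, b0 + sq, rng)
    return lam  # components whose lambda_r shrinks toward zero are prunable
```

In the full FCTN setting, one such variance would presumably be attached to each connecting rank of each core; ranks whose variances concentrate near zero after the MCMC chain mixes are truncated, which is how a learned TN structure can emerge without manual penalty tuning.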
