Abstract

Unlike the flat-view matrix, a tensor provides an elegant representation that preserves the inherently multidimensional structure of data. The wide use of tensor-based approaches highlights the limitations of matrix-based methods and has generated increasing demand for multidimensional data analytics. Driven by such demand, tensor decomposition with automatic rank determination has recently emerged as a promising tool for image processing, wireless communications, and neural network compression. However, unlike matrix-based schemes, which successfully exploit correlation for performance improvement, existing tensor-based methods fail to consider the correlation structure within the tensor. In this paper, we propose to exploit intra-dimension correlation in the tensor within the framework of sparse Bayesian learning. We develop sparsity-inducing probabilistic models under both canonical polyadic decomposition and Tucker decomposition to capture the correlation structure effectively. We derive an efficient model inference method with a fast convergence rate based on the expectation-maximization algorithm. Furthermore, we propose a complexity reduction method that speeds up the computation by exploiting several properties of the Kronecker product. We further extend the proposed framework to incomplete tensors corrupted by sparse outliers, achieving effective tensor completion and outlier mitigation. Simulation results demonstrate the superior performance of the proposed framework, in terms of accuracy and sparsity, over methods that ignore intra-dimension correlation.
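To make the modeling idea concrete, the following is a minimal sketch of a sparsity-inducing prior for Bayesian canonical polyadic decomposition with automatic rank determination. The specific choices here (the mode-wise correlation matrices B_n, the shared column precisions gamma_r, and the Gamma hyperparameters c_0, d_0) are illustrative assumptions, not the exact model derived in the paper.

\[
\mathcal{X} \;\approx\; \sum_{r=1}^{R} \mathbf{a}^{(1)}_{r} \circ \mathbf{a}^{(2)}_{r} \circ \cdots \circ \mathbf{a}^{(N)}_{r},
\qquad
\mathbf{a}^{(n)}_{r} \sim \mathcal{N}\!\big(\mathbf{0},\, \gamma_r^{-1}\mathbf{B}_n\big),
\qquad
\gamma_r \sim \mathrm{Gamma}(c_0, d_0).
\]

Under a prior of this form, a large inferred precision gamma_r drives the r-th column of every factor matrix toward zero, so unneeded rank-one components are pruned and the tensor rank is determined automatically. Setting B_n = I recovers the uncorrelated prior of earlier Bayesian tensor methods, whereas a non-diagonal B_n encodes the intra-dimension correlation that this work aims to exploit.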
