Abstract

Tensor recovery and tensor compression, which rely mainly on tensor decomposition (TD) techniques, are widely used in visual applications. However, existing TD models neglect the inherent correlations among data modes and suffer from difficulties in rank determination and latent factor arrangement. In this paper, we propose a data-adaptive TD model called adaptive tensor networks (ATN) decomposition, which constructs an optimal topological structure for TD according to the intrinsic properties of the data. Specifically, we leverage a generalized tensor rank to measure the correlation between two data modes and then establish a multilinear connection among the corresponding latent factors with an adaptive rank. Moreover, ATN possesses the merits of permutation invariance, strong robustness, and lower storage cost for representing high-order data. In addition, we propose a unifying structured Schatten norm to avoid poor local minima in TD-based applications and develop a scalable algorithm, built upon the linearized alternating direction method (LADM) framework, with a convergence guarantee. The effectiveness and superiority of ATN are thoroughly verified on four representative tasks: tensor completion, image denoising, neural network compression, and infrared small target detection. Experiments on synthetic and real datasets show that ATN outperforms state-of-the-art TD methods.
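To make the mode-correlation idea concrete, below is a minimal illustrative sketch in NumPy. It is not the paper's ATN algorithm or its exact generalized tensor rank; as an assumption, the pairwise correlation between modes i and j is approximated here by the numerical rank of the matricization that groups those two modes against the rest, and that value is used as the adaptive rank on the edge connecting the two corresponding latent factors. The function names (`pairwise_mode_rank`, `build_edge_ranks`) and the thresholding rule are hypothetical.

```python
import numpy as np

def pairwise_mode_rank(X, i, j, tol=1e-8):
    """Proxy for the correlation between modes i and j: the numerical rank
    of the unfolding that groups modes (i, j) against all remaining modes.
    (Illustrative stand-in; ATN's generalized tensor rank may differ.)"""
    order = [i, j] + [k for k in range(X.ndim) if k not in (i, j)]
    M = np.transpose(X, order).reshape(X.shape[i] * X.shape[j], -1)
    s = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

def build_edge_ranks(X, min_rank=1):
    """Assign an adaptive rank to the edge between every pair of modes in a
    tensor-network topology; a rank of 0 means the two factors stay unconnected."""
    n = X.ndim
    ranks = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            r = pairwise_mode_rank(X, i, j)
            if r >= min_rank:  # hypothetical rule: connect only sufficiently correlated modes
                ranks[i, j] = ranks[j, i] = r
    return ranks

# Usage: estimate a topology for a random 4th-order tensor.
X = np.random.randn(8, 8, 8, 8)
print(build_edge_ranks(X))
```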
