Abstract
In this article, we propose a novel bilayer low-rankness measure and two models based on it for recovering a low-rank (LR) tensor. The global low-rankness of the underlying tensor is first encoded by applying LR matrix factorizations (MFs) to all-mode matricizations, which exploits multiorientational spectral low-rankness. The factor matrices of this all-mode decomposition are presumed to be LR themselves, since local low-rankness arises from within-mode correlations. To describe the refined local LR structure of the factors/subspaces in the decomposed subspace, a double nuclear norm scheme, a new low-rankness view of the subspace, is designed to explore this so-called second-layer low-rankness. By simultaneously representing the bilayer low-rankness of all modes of the underlying tensor, the proposed methods model multiorientational correlations for arbitrary N-way (N ≥ 3) tensors. A block successive upper-bound minimization (BSUM) algorithm is designed to solve the resulting optimization problems. Subsequence convergence of our algorithms can be established, and the iterates they generate converge to coordinatewise minimizers under mild conditions. Experiments on several types of public datasets show that our algorithms can recover a variety of LR tensors from significantly fewer samples than their counterparts.
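To make the two layers concrete, the following is a minimal sketch (not the authors' code) of the ingredients the abstract names: mode-n matricization, an LR matrix factorization of each unfolding (first layer), and nuclear norms of both factors as the double nuclear norm surrogate (second layer). The tensor sizes, the rank `r`, and the `unfold` helper are illustrative assumptions, not values from the paper.

```python
import numpy as np

def unfold(X, mode):
    """Mode-n matricization: move `mode` to the front, flatten the rest.
    This is an assumed convention; papers differ in column ordering."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

# Build a synthetic 3-way tensor with multilinear rank (2, 2, 2).
rng = np.random.default_rng(0)
G = rng.standard_normal((2, 2, 2))                 # core tensor
U = [rng.standard_normal((8, 2)) for _ in range(3)]  # factor matrices
X = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])

r = 2  # assumed factorization rank for illustration
surrogates = []
for mode in range(3):
    Xn = unfold(X, mode)
    # First layer: LR matrix factorization X_(n) ≈ A @ B
    A_full, s, Bt = np.linalg.svd(Xn, full_matrices=False)
    A, B = A_full[:, :r] * s[:r], Bt[:r, :]
    # Second layer: double nuclear norm, applied to both factors
    surrogates.append(np.linalg.norm(A, 'nuc') + np.linalg.norm(B, 'nuc'))
```

In a recovery model, a sum of such per-mode surrogates would be minimized subject to a data-fit constraint on the observed entries; the sketch only evaluates the measure on a known tensor.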
IEEE Transactions on Neural Networks and Learning Systems