Abstract

Tucker decomposition is a standard multi-way generalization of Principal-Component Analysis (PCA), appropriate for processing tensor data. Similar to PCA, Tucker decomposition has been shown to be sensitive to faulty data, due to its L2-norm-based formulation, which places squared emphasis on peripheral/outlying entries. In this work, we explore L1-Tucker, an L1-norm-based reformulation of Tucker decomposition, and present two algorithms for its solution, namely L1-norm Higher-Order Singular Value Decomposition (L1-HOSVD) and L1-norm Higher-Order Orthogonal Iterations (L1-HOOI). The proposed algorithms are accompanied by complexity and convergence analyses. Our numerical studies on tensor reconstruction and classification corroborate that L1-Tucker decomposition, implemented by means of the proposed algorithms, attains performance similar to that of standard Tucker when the processed data are corruption-free, while it exhibits sturdy resistance against heavily corrupted entries.

Highlights

  • Tucker tensor decomposition [1]–[3] is a standard method for the analysis and compression of multi-way data

  • Tucker is typically implemented by means of the Higher-Order Singular-Value Decomposition (HOSVD) algorithm, or the Higher-Order Orthogonal Iterations (HOOI) algorithm [2]

  • (1) We present the generalized L1-Tucker decomposition for N-way tensors and review its links to Principal-Component Analysis (PCA), Tucker/Tucker2, and L1-norm-based PCA (L1-PCA)

  • (2) We propose two new algorithmic frameworks for the solution of L1-Tucker/L1-Tucker2, namely L1-norm Higher-Order SVD (L1-HOSVD) and L1-norm Higher-Order Orthogonal Iterations (L1-HOOI)

  • (3) We provide complete convergence analysis for L1-HOOI, as well as complexity analysis for L1-HOSVD and L1-HOOI

  • (4) We present numerical studies on data reconstruction/compression and classification that test the performance of L1-Tucker and compare it with state-of-the-art counterparts
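At the core of both L1-HOSVD and L1-HOOI lies L1-norm principal-component extraction from each mode-n unfolding. As an illustrative sketch only (not the paper's exact multi-component solver), the classical fixed-point iteration for rank-1 L1-PCA can be applied per mode; the rank-(1,1,1) loop below is a hypothetical simplification for intuition.

```python
import numpy as np

def l1pca_rank1(X, iters=200, seed=0):
    """Fixed-point iteration that locally maximizes ||X^T u||_1 over
    unit vectors u (rank-1 L1-PCA); each step does not decrease the
    L1 objective."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(X.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(iters):
        s = np.sign(X.T @ u)
        s[s == 0] = 1.0          # break sign ties toward +1
        v = X @ s
        u_new = v / np.linalg.norm(v)
        if np.allclose(u_new, u):
            break
        u = u_new
    return u

# Rank-(1,1,1) pass over a 3-way tensor: one L1-PCA per mode-n
# unfolding (an illustrative, simplified L1-HOSVD-style sweep).
T = np.random.default_rng(1).standard_normal((3, 4, 5))
factors = [l1pca_rank1(np.moveaxis(T, n, 0).reshape(T.shape[n], -1))
           for n in range(T.ndim)]
```

The full algorithms extract multiple L1-principal components per mode (and, in L1-HOOI, iterate the modes jointly); the sketch above only conveys the per-mode L1-norm objective that replaces the L2-norm objective of standard HOSVD/HOOI.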

Summary

INTRODUCTION

Tucker tensor decomposition [1]–[3] is a standard method for the analysis and compression of multi-way (tensor) data. Similar corruption resistance has recently been attained by algorithms for L1-norm-based reformulations of Tucker decomposition, for the special case of 3-way tensors [38]–[41]. A tensor X can be ‘‘matricized’’ by arranging all its mode-n fibers as the columns of a matrix; this is known as the mode-n unfolding (or flattening) of X, and in this work we denote it by mat(X, n) ∈ RDn×Pn, where Pn is the product of the remaining mode sizes. Following the standard convention, the mode-n unfolding maps each element of X to a unique entry of mat(X, n). The reverse procedure, known as mode-n ‘‘tensorization’’, rearranges the entries of a matrix X ∈ RDn×Pn to form the tensor ten(X; n; {Di}i≠n) ∈ RD1×D2×...×DN , so that mat(ten(X; n; {Di}i≠n), n) = X. For more details on tensor preliminaries, we refer the interested reader to [2], [42]
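The unfolding/tensorization pair described above can be sketched in a few lines of NumPy. Note that the exact column ordering of mat(X, n) depends on the convention adopted; the version below (move mode n to the front, then reshape in C order) is one common choice, and `fold` is its exact inverse, so the round-trip identity mat(ten(X; n), n) = X holds by construction.

```python
import numpy as np

def unfold(X, n):
    """Mode-n unfolding mat(X, n): arrange the mode-n fibers of X as the
    columns of a D_n x P_n matrix, P_n = product of the other mode sizes."""
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def fold(M, n, dims):
    """Mode-n tensorization ten(M; n; {D_i}): inverse of unfold, so that
    unfold(fold(M, n, dims), n) recovers M."""
    front = [dims[n]] + [d for i, d in enumerate(dims) if i != n]
    return np.moveaxis(M.reshape(front), 0, n)

X = np.random.default_rng(0).standard_normal((3, 4, 5))
M = unfold(X, 1)                     # a 4 x 15 matrix
X_back = fold(M, 1, X.shape)         # round-trip recovers X
```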

TUCKER DECOMPOSITION
DATA CORRUPTION
PRIOR WORK
PROPOSED ALGORITHM 1
PROPOSED ALGORITHM 2
CLASSIFICATION
Findings
CONCLUSION
