Abstract

In this work, we consider the problem of factoring a third-order tensor into multilinear rank-$(L_r, L_r, 1)$ terms. This model, referred to as the rank-$(L_r, L_r, 1)$ block-term decomposition (BTD), finds many applications in signal processing, notably blind separation of smooth sources and unmixing of spectral-spatial data (e.g., hyperspectral images). On the other hand, finding the latent factors of the rank-$(L_r, L_r, 1)$ BTD poses a very challenging optimization problem. Some computational tools designed for canonical polyadic decomposition (CPD), e.g., alternating least squares (ALS) and Levenberg-Marquardt (LM) based algorithms, can be modified to handle the rank-$(L_r, L_r, 1)$ BTD. Nonetheless, these methods essentially treat the rank-$(L_r, L_r, 1)$ BTD as a special CPD problem. This raises a number of challenges, since the rank-$(L_r, L_r, 1)$ BTD can be viewed as a CPD problem with rank-deficient latent factors and a high CP rank, and these are known to be hard cases for CPD algorithms. In this work, we reformulate the rank-$(L_r, L_r, 1)$ BTD problem as a rank-constrained matrix factorization problem. We propose a simple algorithm that combines alternating optimization and projected gradient steps. As a result, the per-iteration complexity is much lower than that of the ALS- and LM-based algorithms. We also show that the algorithm converges to a stationary point of the problem of interest at a sublinear rate, even though nonconvex constraints are involved. Numerical experiments are conducted to showcase the effectiveness of our algorithm.
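
For context, the rank-$(L_r, L_r, 1)$ BTD expresses a third-order tensor $\mathcal{T} \in \mathbb{R}^{I \times J \times K}$ as a sum of $R$ block terms. The display below is the standard form of this model from the BTD literature; the notation is illustrative and may differ slightly from that used in the paper:

$$
\mathcal{T} \;\approx\; \sum_{r=1}^{R} \left(\mathbf{A}_r \mathbf{B}_r^{\top}\right) \circ \mathbf{c}_r,
\qquad
\mathbf{A}_r \in \mathbb{R}^{I \times L_r},\;
\mathbf{B}_r \in \mathbb{R}^{J \times L_r},\;
\mathbf{c}_r \in \mathbb{R}^{K},
$$

where $\circ$ denotes the outer product, so each block term $(\mathbf{A}_r \mathbf{B}_r^{\top}) \circ \mathbf{c}_r$ has multilinear rank $(L_r, L_r, 1)$. The rank constraint mentioned in the abstract corresponds to requiring that each matrix $\mathbf{A}_r \mathbf{B}_r^{\top}$ has rank at most $L_r$.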
