Abstract

Regularization can mitigate the swamp effect of alternating least squares (ALS) algorithms for tensor decomposition. Usually, the regularization term is a norm of the difference between the solution and the current iterate. In this paper, we show that this norm can be weakened to a seminorm, which allows a more flexible choice of regularization term. To overcome the swamp effect and to avoid the drawback that the Hessian of each subproblem may become nearly singular during the iterations, we propose a seminorm regularized ALS algorithm for computing the canonical tensor decomposition. Moreover, the new algorithm introduces a novel extrapolation in the update of each mode factor, which has an immediate effect on the updates of the subsequent factors. Assuming the boundedness of the infinite sequence of iterates generated by the new algorithm, we establish the global convergence and a (weakly) linear convergence rate of the sequence of iterates. Numerical experiments on synthetic and real-world problems illustrate that the new method is efficient and promising.
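To make the idea concrete, the Python sketch below shows what one sweep of a proximally regularized ALS iteration for a third-order CP decomposition might look like. It is only an illustration under assumptions, not the authors' algorithm: the regularizer is induced by a matrix M (a positive semidefinite M gives a seminorm), and the weight lam, the extrapolation weight beta, and the way the extrapolated factor feeds the remaining modes are placeholders chosen for the example.

# A minimal sketch (an illustration, not the paper's exact algorithm) of one sweep
# of regularized ALS for a rank-R CP decomposition of a third-order tensor.
# The proximal term lam * tr((A - A_prev) M (A - A_prev)^T) with a positive
# SEMIdefinite M induces only a seminorm; M, lam, and beta are assumed values.
import numpy as np

def khatri_rao(U, V):
    # Column-wise Khatri-Rao product of U (I x R) and V (J x R) -> (I*J) x R.
    I, R = U.shape
    J, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(I * J, R)

def unfold(T, mode):
    # Mode-n unfolding so that X_(n) = A_n @ khatri_rao(later, earlier).T holds.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1, order="F")

def regularized_als_sweep(T, factors, lam=1e-3, M=None, beta=0.05):
    # One pass over the three mode factors.  Each subproblem
    #   min_A ||X_(n) - A K^T||_F^2 + lam * tr((A - A_prev) M (A - A_prev)^T)
    # has the closed form A = (X_(n) K + lam A_prev M)(K^T K + lam M)^{-1};
    # the added lam*M term keeps the normal-equation matrix away from singularity.
    R = factors[0].shape[1]
    if M is None:
        M = np.eye(R)   # identity gives an ordinary norm; a PSD M gives a seminorm
    new, extra = list(factors), list(factors)
    for n in range(3):
        others = [extra[m] for m in range(3) if m != n][::-1]  # e.g. (C, B) for mode 0
        K = khatri_rao(*others)
        G = K.T @ K + lam * M
        rhs = unfold(T, n) @ K + lam * new[n] @ M
        updated = np.linalg.solve(G, rhs.T).T
        extra[n] = updated + beta * (updated - new[n])  # extrapolated copy feeds the
        new[n] = updated                                # updates of the remaining modes
    return new

# Tiny usage example on a synthetic rank-3 tensor.
rng = np.random.default_rng(0)
dims, R = (10, 11, 12), 3
A0, B0, C0 = (rng.standard_normal((d, R)) for d in dims)
T = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
factors = [rng.standard_normal((d, R)) for d in dims]
for _ in range(300):
    factors = regularized_als_sweep(T, factors)
A, B, C = factors
err = np.linalg.norm(T - np.einsum("ir,jr,kr->ijk", A, B, C)) / np.linalg.norm(T)
print(f"relative error: {err:.2e}")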
