Summary

We present Nesterov-type acceleration techniques for alternating least squares (ALS) methods applied to canonical tensor decomposition. While Nesterov acceleration turns gradient descent into an optimal first-order method for convex problems by adding a momentum term with a specific weight sequence, a direct application of this method and weight sequence to ALS results in erratic convergence behavior. This is because it is ALS, rather than gradient descent, that is being accelerated, and because the underlying problem is nonconvex. Instead, we consider various restart mechanisms and suitable choices of momentum weights that enable effective acceleration. Our extensive empirical results show that the Nesterov-accelerated ALS methods with restart can be dramatically more efficient than stand-alone ALS or Nesterov's accelerated gradient method when problems are ill-conditioned or accurate solutions are desired. The resulting methods perform competitively with, or better than, existing acceleration methods for ALS, including ALS acceleration by nonlinear conjugate gradient, the nonlinear generalized minimal residual method, or limited-memory Broyden-Fletcher-Goldfarb-Shanno, and have the additional benefit of being much easier to implement. We also compare with Nesterov-type updates in which the momentum weight is determined by a line search (LS); these are equivalent to, or closely related to, existing LS methods for ALS. On a large and ill-conditioned 71×1,000×900 tensor consisting of readings from chemical sensors used to track hazardous gases, the restarted Nesterov-ALS method shows desirable robustness properties and outperforms all of the existing methods we compare with by a large factor. There is clear potential for extending our Nesterov-type acceleration approach to accelerating optimization algorithms other than ALS, applied to other nonconvex problems such as Tucker tensor decomposition.
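To make the idea concrete, the following is a minimal Python/NumPy sketch of one possible restarted Nesterov-ALS scheme for a third-order CP decomposition. The fixed momentum weight `beta`, the simple function-value restart test, and all function names are illustrative simplifications of our own, not the specific weight sequences or restart mechanisms studied in the paper.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a third-order tensor (C-order reshape)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def als_sweep(T, factors):
    """One ALS sweep: update each factor matrix by a linear least-squares solve."""
    A = [f.copy() for f in factors]
    for n in range(3):
        others = [A[m] for m in range(3) if m != n]
        # Khatri-Rao product of the other two factors, consistent with unfold()
        kr = np.einsum('ir,jr->ijr', others[0], others[1]).reshape(-1, A[n].shape[1])
        gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
        A[n] = np.linalg.solve(gram, (unfold(T, n) @ kr).T).T
    return A

def cp_error(T, A):
    """Relative reconstruction error of the rank-R CP model."""
    approx = np.einsum('ir,jr,kr->ijk', *A)
    return np.linalg.norm(T - approx) / np.linalg.norm(T)

def nesterov_als(T, rank, iters=200, beta=0.5, tol=1e-8, seed=0):
    """Nesterov-style momentum applied to ALS iterates, with a function-value
    restart: if the error increases, discard the momentum step and do plain ALS."""
    rng = np.random.default_rng(seed)
    A = [rng.standard_normal((s, rank)) for s in T.shape]
    A_prev = [f.copy() for f in A]
    err_prev = cp_error(T, A)
    for _ in range(iters):
        # Extrapolated (momentum) point, then one ALS sweep starting from it
        Y = [a + beta * (a - ap) for a, ap in zip(A, A_prev)]
        A_new = als_sweep(T, Y)
        err = cp_error(T, A_new)
        if err > err_prev:          # restart: fall back to an un-accelerated sweep
            A_new = als_sweep(T, A)
            err = cp_error(T, A_new)
        A_prev, A = A, A_new
        if abs(err_prev - err) < tol:
            break
        err_prev = err
    return A, err

# Example use: recover a rank-5 CP model from a small synthetic tensor
rng = np.random.default_rng(1)
true = [rng.standard_normal((s, 5)) for s in (30, 40, 50)]
T = np.einsum('ir,jr,kr->ijk', *true) + 1e-3 * rng.standard_normal((30, 40, 50))
factors, err = nesterov_als(T, rank=5)
print(f"relative error: {err:.2e}")
```

The fixed weight `beta` is used here purely for simplicity; as the abstract indicates, the actual methods combine restart mechanisms with carefully chosen momentum-weight sequences, or determine the weight by a line search.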