Abstract

We derive nonlinear acceleration methods based on the limited‐memory Broyden–Fletcher–Goldfarb–Shanno (L‐BFGS) update formula for accelerating iterative optimization methods of alternating least squares (ALS) type applied to canonical polyadic and Tucker tensor decompositions. Our approach starts from linear preconditioning ideas that use linear transformations encoded by matrix multiplications and extends these ideas to the case of genuinely nonlinear preconditioning, where the preconditioning operation involves fully nonlinear transformations. As such, the ALS‐type iterations are used as fully nonlinear preconditioners for L‐BFGS, or equivalently, L‐BFGS is used as a nonlinear accelerator for ALS. Numerical results show that the resulting methods perform much better than either stand‐alone L‐BFGS or stand‐alone ALS, offering substantial improvements in terms of time to solution and robustness over state‐of‐the‐art methods for large and noisy tensor problems, including previously described acceleration methods based on nonlinear conjugate gradients and the nonlinear generalized minimal residual method. Our approach provides a general L‐BFGS‐based acceleration mechanism for nonlinear optimization.
