Abstract

Solving large-scale non-convex optimization problems is a fundamental challenge in the development of matrix factorization (MF)-based recommender systems. Unfortunately, conventional first-order optimization approaches struggle on these problems because their loss landscapes are highly complex. Second-order optimization methods hold great promise: they are more powerful because they account for the curvature of the optimization problem, which is captured by the second-order derivatives of the objective function. However, a significant obstacle arises when directly applying Hessian-based approaches: their computational demands are often prohibitively high. Therefore, the authors propose AdaGO, a novel quasi-Newton optimizer designed to meet the specific requirements of large-scale non-convex optimization problems. AdaGO strikes a balance between computational efficiency and optimization performance. In comparative studies with state-of-the-art MF-based models, AdaGO demonstrates its superiority by achieving higher prediction accuracy.
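The abstract does not spell out AdaGO's update rule, so the sketch below is only a generic, hypothetical illustration of the quasi-Newton idea it builds on: scaling each MF gradient component by a cheap diagonal approximation of the Hessian instead of the full (prohibitively expensive) Hessian. All names and hyperparameters here (`factorize`, `lr`, `lam`) are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def factorize(R, k=2, steps=100, lr=0.3, lam=0.1, seed=0):
    """Factor R ~= P @ Q.T on observed (non-zero) entries.

    Illustrative quasi-Newton-style step: each gradient component is
    divided by a diagonal second-derivative estimate, so the step size
    adapts to local curvature (a stand-in for Hessian-based methods;
    NOT AdaGO's actual update, which the abstract does not give).
    """
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    obs = np.argwhere(R != 0)  # treat zeros as missing ratings
    losses = []
    for _ in range(steps):
        sq_err = 0.0
        for u, i in obs:
            e = R[u, i] - P[u] @ Q[i]
            sq_err += e * e
            # Gradients of 0.5*e^2 + 0.5*lam*(|p|^2 + |q|^2)
            gp = -e * Q[i] + lam * P[u]
            gq = -e * P[u] + lam * Q[i]
            # Diagonal Hessian estimates: d^2L/dp_k^2 ~= q_k^2 + lam
            hp = Q[i] ** 2 + lam
            hq = P[u] ** 2 + lam
            # Curvature-scaled (damped Newton-like) parameter updates
            P[u] -= lr * gp / hp
            Q[i] -= lr * gq / hq
        losses.append(sq_err)
    return P, Q, losses
```

A first-order method would use a single global learning rate for every coordinate; the curvature scaling above takes larger steps in flat directions and smaller ones in sharp directions, which is the advantage of second-order information that the abstract refers to.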
