Abstract

The Frank-Wolfe algorithm (FW) is a popular projection-free alternative for solving large-scale constrained optimization problems. However, the FW algorithm suffers from a sublinear convergence rate when minimizing a smooth convex function over a compact convex set. Thus, exploring techniques that yield a faster convergence rate becomes crucial. A classic approach to obtaining faster rates is to combine previous iterates to form the next iterate. In this work, we extend this approach to the FW setting and show that the optimal way to combine the past iterates is via a set of orthogonal Jacobi polynomials. We also propose a polynomial-based acceleration technique, referred to as Jacobi polynomial accelerated FW, which combines the current iterate with the past iterate using combining weights related to the Jacobi recursion. By carefully choosing the parameters of the Jacobi polynomials, we obtain a faster sublinear convergence rate. We provide numerical experiments on real datasets to demonstrate the efficacy of the proposed algorithm.
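
The abstract only sketches the method, but the general structure it describes (a vanilla FW step followed by a weighted combination with the previous iterate) can be illustrated with a minimal Python sketch. Everything below is an assumption for illustration: the feasible set is taken to be an l1 ball, the helper names `lmo_l1_ball`, `jacobi_weight`, and `accelerated_fw` are hypothetical, and the weight schedule is a simple placeholder standing in for the paper's actual weights derived from the Jacobi three-term recurrence.

```python
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle over the l1 ball of the given radius:
    argmin_{||s||_1 <= radius} <grad, s> is a signed, scaled coordinate
    vector at the coordinate of largest absolute gradient."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def jacobi_weight(t, alpha=0.5, beta=0.5):
    """Placeholder mixing weight lambda_t. The paper derives its weights
    from the recursion of Jacobi polynomials P_t^{(alpha, beta)}; the
    exact schedule is not stated in the abstract, so this simple
    decaying weight is an illustrative assumption only."""
    return (t + beta) / (t + alpha + beta + 2.0)

def accelerated_fw(grad_f, x0, lmo, n_iters=200):
    """FW with a momentum-style combination of the fresh FW point and
    the previous iterate, mirroring the structure described in the
    abstract (the specific weights here are assumptions)."""
    x_prev = x0.copy()
    x = x0.copy()
    for t in range(n_iters):
        gamma = 2.0 / (t + 2.0)             # standard FW step size
        s = lmo(grad_f(x))                  # linear minimization oracle call
        z = (1.0 - gamma) * x + gamma * s   # vanilla FW update
        lam = jacobi_weight(t)              # hypothetical recursion weight
        x, x_prev = (1.0 - lam) * z + lam * x_prev, x  # mix in the past iterate
    return x

# Example: minimize ||Ax - b||^2 over the l1 ball (a LASSO-type problem).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 20)), rng.standard_normal(50)
grad_f = lambda x: 2.0 * A.T @ (A @ x - b)
x_star = accelerated_fw(grad_f, np.zeros(20), lmo_l1_ball)
```

Setting `lam = 0` recovers vanilla FW, which makes the combination step easy to compare against the unaccelerated baseline.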
