Abstract

There have been a number of recent advances in accelerated gradient and proximal schemes for the optimization of convex finite-sum problems. Defazio introduced a simple accelerated scheme for incremental stochastic proximal algorithms inspired by gradient-based methods such as SAGA. He was able to prove O(1/k) convergence for non-smooth functions, but only under the assumption of strong convexity of the component terms. We introduce a slight modification of his scheme, called MP-SAGA, for which we can prove O(1/k) convergence without strong convexity, but for smooth functions. Numerical results show that our method has better or comparable convergence to Defazio's scheme, even for non-strongly convex functions. As important special cases, we also derive accelerated schemes for a multi-class formulation of SVM as well as for clustering based on the SON regularization. Finally, we introduce a simplification of Point-SAGA, called SP-SAGA, for problems such as SON with a large number of variables and a sparse relation between variables and objective terms.
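For context, the sketch below illustrates a Point-SAGA-style iteration (Defazio's scheme, the starting point of this work) on a toy least-squares finite sum. The problem data, step size, and closed-form prox are illustrative assumptions for this example only; it is not the MP-SAGA or SP-SAGA variant introduced in the paper.

```python
import numpy as np

# Toy finite sum f(x) = (1/n) * sum_i 0.5 * (a_i @ x - b_i)^2.
# Data, step size, and iteration count are assumptions for illustration.
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

gamma = 0.1                      # proximal step size (illustrative choice)
x = np.zeros(d)
g = A * (A @ x - b)[:, None]     # table of stored component gradients g_i
g_bar = g.mean(axis=0)           # running average of the table

for k in range(5000):
    j = rng.integers(n)
    # Point-SAGA step: re-center using the gradient table, then apply
    # the prox of a single randomly chosen component.
    z = x + gamma * (g[j] - g_bar)
    # Closed-form prox of f_j(x) = 0.5*(a_j @ x - b_j)^2 at z
    # (rank-one solve via the Sherman-Morrison formula).
    a_j = A[j]
    x_new = z - gamma * a_j * (a_j @ z - b[j]) / (1.0 + gamma * a_j @ a_j)
    # Update the table: (z - x_new)/gamma equals grad f_j(x_new) by
    # optimality of the prox step.
    g_new = (z - x_new) / gamma
    g_bar += (g_new - g[j]) / n
    g[j] = g_new
    x = x_new

print("residual norm:", np.linalg.norm(A @ x - b))
```

The key design point is that each iteration touches only one component function through its proximal operator, while the stored gradient table keeps the step unbiased; the variants discussed in the paper modify how this update is carried out and analyzed.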
