Abstract

This chapter reviews representative accelerated algorithms for deterministic constrained convex optimization. We give an overview of the accelerated penalty method, the accelerated Lagrange multiplier method, and the accelerated augmented Lagrange multiplier method. In particular, we concentrate on two widely used algorithms, namely the alternating direction method of multipliers (ADMM) and the primal-dual method. For ADMM, we study four scenarios: the generally convex and nonsmooth case, the strongly convex and nonsmooth case, the generally convex and smooth case, and the strongly convex and smooth case. We also introduce its non-ergodic accelerated variant. For the primal-dual method, we study three scenarios: both functions are generally convex, both are strongly convex, and one is generally convex while the other is strongly convex. Finally, we introduce the Frank–Wolfe algorithm under the condition of a strongly convex constraint set.
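
For orientation, the ADMM results summarized above concern the standard two-block linearly constrained problem; a minimal sketch of that problem and the classical (unaccelerated) ADMM iteration is given below. The penalty parameter $\rho > 0$ is an assumed choice here, and the accelerated variants studied in the chapter modify these basic updates.

\begin{equation*}
\min_{x,y}\; f(x) + g(y) \quad \text{s.t.}\quad Ax + By = b,
\end{equation*}
with augmented Lagrangian
\begin{equation*}
L_\rho(x,y,\lambda) = f(x) + g(y) + \langle \lambda, Ax + By - b\rangle + \tfrac{\rho}{2}\|Ax + By - b\|^2,
\end{equation*}
and the classical iteration
\begin{align*}
x^{k+1} &= \operatorname*{argmin}_x\, L_\rho(x, y^k, \lambda^k),\\
y^{k+1} &= \operatorname*{argmin}_y\, L_\rho(x^{k+1}, y, \lambda^k),\\
\lambda^{k+1} &= \lambda^k + \rho\,(Ax^{k+1} + By^{k+1} - b).
\end{align*}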
