Abstract

In this chapter, we will further explore structural properties that can be exploited when solving optimization problems. We will identify potential bottlenecks in solving these problems and develop new techniques that can skip expensive operations from time to time. More specifically, we first consider a class of composite optimization problems whose objective function is given by the summation of a general smooth component and a nonsmooth component, and present the gradient sliding (GS) algorithm, which can skip the computation of the gradient of the smooth component from time to time. We then discuss an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants, and show that the AGS method can skip the gradient computation for one of these smooth components without slowing down the overall optimal rate of convergence. The AGS method can further improve the complexity of solving an important class of bilinear saddle point problems. In addition, we present a new class of decentralized first-order methods for nonsmooth and stochastic optimization problems defined over multiagent networks. These methods can skip inter-node communications while agents solve the primal subproblems iteratively through linearizations of their local objective functions.
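
To make the sliding idea concrete, below is a minimal Python sketch of the skipping pattern behind GS, not the actual GS method: the stepsize, the inner-loop length, and the inner update are simplified placeholders, whereas GS uses carefully chosen parameters and averaging to retain the optimal rate. The point it illustrates is that the outer loop evaluates the gradient of the smooth component f once, and the inner loop takes several cheap subgradient steps on the nonsmooth component h with that gradient held fixed, so the expensive gradient computation is skipped during inner iterations.

    import numpy as np

    def gradient_sliding_sketch(grad_f, subgrad_h, x0, n_outer=50, n_inner=10, eta=0.01):
        # Illustrative sketch for min_x f(x) + h(x), f smooth, h nonsmooth.
        # grad_f is evaluated only once per outer iteration; the inner loop
        # reuses that gradient and only queries a subgradient of h.
        x = np.asarray(x0, dtype=float)
        for k in range(n_outer):
            g = grad_f(x)                 # one (expensive) gradient of f
            for t in range(n_inner):      # inner steps skip grad_f entirely
                x = x - eta * (g + subgrad_h(x))
        return x

    # Toy usage (hypothetical data): f(x) = 0.5*||Ax - b||^2, h(x) = ||x||_1.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    x_hat = gradient_sliding_sketch(
        grad_f=lambda x: A.T @ (A @ x - b),
        subgrad_h=lambda x: np.sign(x),   # a subgradient of the l1 norm
        x0=np.zeros(5),
    )

With n_inner inner steps per outer iteration, roughly n_inner subgradient evaluations of h are performed for every single gradient evaluation of f, which is the source of the complexity savings when computing the gradient of f dominates the cost per iteration.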
