Abstract

A new decomposition optimization algorithm, called path-following gradient-based decomposition, is proposed to solve separable convex optimization problems. Unlike the path-following Newton methods considered in the literature, this algorithm does not require any smoothness assumption on the objective function. This allows us to handle more general classes of problems arising in many practical applications than path-following Newton methods can. The new algorithm combines three techniques, namely smoothing, Lagrangian decomposition, and a path-following gradient framework. The algorithm decomposes the original problem into smaller subproblems by means of dual decomposition and smoothing via self-concordant barriers, updates the dual variables using a path-following gradient method, and allows the subproblems to be solved in parallel. Moreover, compared to augmented Lagrangian approaches, our algorithmic parameters are updated automatically without any tuning strategy. We prove the global convergence of the new algorithm and analyze its convergence rate. We then modify the proposed algorithm by applying Nesterov's acceleration scheme, obtaining a new variant with a better convergence rate than the first algorithm. Finally, we present preliminary numerical tests that confirm the theoretical development.
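To make the overall scheme concrete, the following is a minimal sketch (not the paper's exact method) of the three ingredients on a toy separable problem: the inequality constraints are smoothed with a log barrier (a self-concordant barrier), the dual variable is updated by plain gradient ascent, and the barrier parameter is driven toward zero along a path. The problem data, step size, and barrier schedule below are all hypothetical choices for illustration only.

```python
# Illustrative sketch of path-following gradient-based decomposition on
#     min  sum_i 0.5*(x_i - c_i)^2   s.t.  sum_i x_i = b,  x_i >= 0,
# with the bounds x_i >= 0 smoothed by the log barrier -t*log(x_i).
import numpy as np

def solve_subproblem(c_i, y, t):
    """Closed-form minimizer of 0.5*(x - c_i)^2 + y*x - t*log(x).
    Setting the derivative to zero gives x^2 + (y - c_i)*x - t = 0,
    whose positive root is the solution."""
    s = c_i - y
    return 0.5 * (s + np.sqrt(s**2 + 4.0 * t))

def path_following_dual_gradient(c, b, t0=1.0, t_min=1e-6,
                                 shrink=0.7, alpha=0.2, iters=50):
    y = 0.0                      # dual variable for sum_i x_i = b
    t = t0                       # smoothing (barrier) parameter
    while t > t_min:
        for _ in range(iters):
            # The subproblems are independent, so in a real
            # implementation they could be solved in parallel.
            x = np.array([solve_subproblem(ci, y, t) for ci in c])
            y += alpha * (x.sum() - b)   # gradient ascent on the dual
        t *= shrink              # follow the path: reduce smoothing
    return x, y

c = np.array([3.0, -1.0, 2.0])   # hypothetical problem data
x_opt, y_opt = path_following_dual_gradient(c, b=2.0)
print("x* =", x_opt, " residual =", x_opt.sum() - 2.0)
```

The barrier smoothing is what makes the dual function differentiable, so a simple gradient step on the dual suffices in place of a Newton step; the actual algorithm in the paper couples the dual update and the barrier-parameter update in a principled path-following way rather than using the fixed shrink factor shown here.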
