Abstract

This paper presents a hierarchical distributed optimization algorithm based on quasi-Newton update steps. Separable convex optimization problems are decoupled through dual decomposition and solved in a distributed fashion by coordinating the solutions of the subproblems through dual variables. The proposed algorithm updates the dual variables by approximating the Hessian of the dual function from collected subgradient information, analogously to quasi-Newton methods. Because the dual maximization problem is generally nonsmooth, a purely smooth approximation may perform poorly. To this end, cutting planes, analogous to those used in bundle methods, are constructed that account for the nonsmoothness of the dual function and lead to better convergence behavior near the optimum. The proposed algorithm is evaluated on a large set of benchmark problems and compared to the subgradient method and the bundle trust method for nonsmooth optimization.
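
To make the dual-decomposition setting concrete, the following is a minimal sketch, in Python/NumPy, of a quasi-Newton (BFGS-style) update of the dual variables for a toy problem with two quadratic subproblems coupled by a linear equality constraint. The problem data, step sizes, and tolerances are illustrative assumptions, not taken from the paper; the quadratic subproblems make the dual function smooth, so the paper's cutting-plane treatment of nonsmoothness is not reproduced here. Only the core idea is shown: each subproblem is solved independently for fixed dual variables, the coupling-constraint residual serves as a (sub)gradient of the dual function, and curvature information is accumulated from successive subgradients.

```python
# Sketch: dual decomposition with a BFGS-style dual update (illustrative data).
import numpy as np

rng = np.random.default_rng(0)

# Separable convex problem:
#   min  sum_i 1/2 x_i' Q_i x_i + c_i' x_i
#   s.t. sum_i A_i x_i = b          (coupling constraint)
n, m = 4, 3
Qs = [np.eye(n) + np.diag(rng.random(n)) for _ in range(2)]   # strictly convex
cs = [rng.standard_normal(n) for _ in range(2)]
As = [rng.standard_normal((m, n)) for _ in range(2)]
b = rng.standard_normal(m)

def solve_subproblems(lam):
    """Each agent minimizes f_i(x_i) + lam' A_i x_i independently."""
    return [np.linalg.solve(Q, -(c + A.T @ lam))
            for Q, c, A in zip(Qs, cs, As)]

def dual_value_and_subgradient(lam):
    """Dual function q(lam) and its (sub)gradient, the constraint residual."""
    xs = solve_subproblems(lam)
    q = sum(0.5 * x @ Q @ x + c @ x + lam @ (A @ x)
            for Q, c, A, x in zip(Qs, cs, As, xs)) - lam @ b
    g = sum(A @ x for A, x in zip(As, xs)) - b
    return q, g

lam = np.zeros(m)
H = np.eye(m)                        # inverse-Hessian approximation of -q
q, g = dual_value_and_subgradient(lam)
for k in range(100):
    d = H @ g                        # quasi-Newton ascent direction on q
    t = 1.0
    while True:                      # backtracking line search: ensure ascent
        q_new, g_new = dual_value_and_subgradient(lam + t * d)
        if q_new >= q + 1e-4 * t * (g @ d) or t < 1e-10:
            break
        t *= 0.5
    s, y = t * d, g - g_new          # y is the gradient difference of -q
    if y @ s > 1e-12:                # curvature condition keeps H positive definite
        rho = 1.0 / (y @ s)
        V = np.eye(m) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)
    lam, q, g = lam + t * d, q_new, g_new
    if np.linalg.norm(g) < 1e-8:     # coupling constraint (nearly) satisfied
        break

print(f"converged in {k + 1} dual iterations, residual {np.linalg.norm(g):.2e}")
```

With strictly convex quadratic subproblems the dual is twice differentiable and the BFGS recursion recovers its curvature quickly; when the subproblem minimizers are not unique, the dual becomes nonsmooth and this plain quasi-Newton scheme degrades near the optimum, which is the regime the paper's cutting planes are designed to handle.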
