Abstract
The symmetric alternating direction method of multipliers (ADMM) is an efficient algorithm that updates the Lagrange multiplier twice at each iteration and treats the variables in a symmetric manner. The convergence range of the step-size parameters plays an important role in the implementation of the algorithm. In this paper, we analyze the convergence rate of the symmetric ADMM with a more relaxed parameter range for solving the two-block nonconvex separable optimization problem, under the assumption that the generated sequence is bounded. Two cases are considered. In the first case, both components of the objective function are nonconvex; we prove the convergence of the augmented Lagrangian function sequence and establish an $ O(1/\sqrt{k}) $ worst-case complexity measured by the difference of two consecutive iterates. In the second case, one component of the objective function is convex and an error bound condition is assumed; we then prove that the iterative sequence converges locally to a KKT point at an R-linear rate, and that an auxiliary sequence converges at a Q-linear rate. Furthermore, a practical inexact symmetric ADMM with relative error criteria is proposed, and the associated convergence analysis is established under the same conditions.
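To make the iteration scheme concrete, the following is a minimal sketch of a symmetric ADMM step on a toy convex consensus problem, min 0.5||x-a||^2 + 0.5||y-c||^2 subject to x - y = 0. This is an illustration of the "two multiplier updates per iteration" structure only, not the paper's nonconvex setting or its exact parameter range; the step factors r and s, the penalty beta, and the closed-form subproblem solutions are assumptions chosen so the quadratic subproblems solve in one line.

```python
import numpy as np

# Augmented Lagrangian for x - y = 0:
#   L(x, y, lam) = 0.5||x-a||^2 + 0.5||y-c||^2 - lam^T (x - y) + (beta/2)||x - y||^2
def symmetric_admm(a, c, beta=1.0, r=0.9, s=0.9, iters=100):
    x = np.zeros_like(a); y = np.zeros_like(a); lam = np.zeros_like(a)
    for _ in range(iters):
        # x-subproblem: argmin_x L(x, y, lam), closed form for this quadratic f
        x = (a + lam + beta * y) / (1.0 + beta)
        # first (intermediate) multiplier update with step factor r
        lam = lam - r * beta * (x - y)
        # y-subproblem: argmin_y L(x, y, lam), closed form for this quadratic g
        y = (c - lam + beta * x) / (1.0 + beta)
        # second multiplier update with step factor s
        lam = lam - s * beta * (x - y)
    return x, y, lam

# With a = 1 and c = 3, the consensus solution is x = y = (a + c)/2 = 2
x, y, lam = symmetric_admm(np.array([1.0]), np.array([3.0]))
```

The difference from standard ADMM is the extra multiplier update between the x- and y-subproblems, so the y-subproblem sees a fresher multiplier; the (r, s) ranges guaranteeing convergence are exactly what the paper relaxes.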
Journal of Industrial & Management Optimization