This paper presents a class of proximal alternating direction method of multipliers (ADMM) algorithms for solving nonconvex, nonsmooth optimization problems with nonlinear coupled constraints. The key feature of the proposed algorithm is a linearized proximal technique for updating the primal variables, followed by a discounted update of the dual variables. This design eliminates the need for an additional proxy function and thereby simplifies the optimization process. In addition, the algorithm keeps its parameters fixed throughout the updates, removing the need to adjust them to guarantee that the generated sequence decreases. Building on this framework, we construct a Lyapunov function that is lower bounded and decreases sufficiently at each iteration, which is essential for analyzing the convergence properties of the algorithm. We rigorously prove both subsequence convergence and global convergence of the algorithm, ensuring its robustness and effectiveness in solving complex optimization problems. Our paper provides a solid theoretical foundation for the practical application of this method to nonconvex optimization problems with nonlinear coupled constraints.
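To make the described scheme concrete, the following is a minimal sketch of one plausible instantiation, assuming a problem of the form min f(x) + g(y) subject to c(x, y) = 0 and an augmented Lagrangian with penalty rho. The primal steps linearize the augmented Lagrangian and take a proximal gradient step, and the dual step shrinks the multiplier by a discount factor tau < 1. All function names, parameters (rho, beta, tau), and the specific update formulas are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def linearized_prox_admm(f_grad, g_grad, c, c_jac_x, c_jac_y,
                         x, y, lam, rho=1.0, beta=0.1, tau=0.9,
                         iters=100):
    """Hypothetical sketch of a linearized proximal ADMM iteration
    with a discounted dual update; not the paper's exact scheme."""
    for _ in range(iters):
        # Linearized proximal update for x: gradient step on the
        # augmented Lagrangian; the step size beta plays the role
        # of the proximal parameter (1/beta proximal term).
        r = c(x, y)
        grad_x = f_grad(x) + c_jac_x(x, y).T @ (lam + rho * r)
        x = x - beta * grad_x
        # Analogous linearized proximal update for y, using the new x.
        r = c(x, y)
        grad_y = g_grad(y) + c_jac_y(x, y).T @ (lam + rho * r)
        y = y - beta * grad_y
        # Discounted dual update: the multiplier is damped by tau < 1
        # before the usual residual-based ascent step, which is what
        # allows the parameters to stay fixed across iterations.
        lam = tau * lam + rho * c(x, y)
    return x, y, lam
```

In this sketch the discounting of lam is what a Lyapunov-type analysis would exploit: the damped multiplier keeps the dual sequence bounded, so a merit function combining the augmented Lagrangian and the dual residual can be shown to decrease.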