Abstract

This paper presents a novel backtracking strategy that serves as an acceleration scheme for additive Schwarz methods for general convex optimization problems. The proposed strategy is independent of the local solvers, so it applies to any algorithm that fits the abstract framework of additive Schwarz methods. By adaptively increasing and decreasing the step size over the iterations, it improves the convergence rate of the underlying algorithm, and this improved rate is analyzed rigorously. In addition, by combining the proposed backtracking strategy with a momentum acceleration technique, we obtain a further accelerated additive Schwarz method. Numerical results for various convex optimization problems, including nonlinear elliptic, nonsmooth, and nonsharp problems, are presented to support our theory.
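To make the idea concrete, the following is a minimal sketch of an adaptive backtracking loop of the kind the abstract describes: a trial step size is tentatively enlarged at each iteration and then shrunk until a sufficient-decrease condition holds. The objective, the block "local corrections" standing in for summed subdomain solves, and all parameter values are hypothetical illustrations, not the paper's actual framework or constants.

```python
import numpy as np

def f(x):
    # Hypothetical smooth convex test objective (stand-in for the
    # general convex problem treated in the paper).
    return 0.5 * np.dot(x, x) + np.log(1.0 + np.exp(x).sum())

def grad_f(x):
    e = np.exp(x)
    return x + e / (1.0 + e.sum())

def local_corrections(x, blocks):
    # Hypothetical "local solves": blockwise negative-gradient steps
    # summed together, mimicking additive Schwarz subdomain corrections.
    d = np.zeros_like(x)
    g = grad_f(x)
    for idx in blocks:
        d[idx] = -g[idx]
    return d

def schwarz_backtracking(x0, blocks, tau=1.0, inc=2.0, dec=0.5,
                         c=1e-4, iters=50):
    # Adaptive backtracking: tau is tentatively increased at the start of
    # each iteration, then decreased until an Armijo-type sufficient-
    # decrease condition holds. Parameter choices here are illustrative.
    x = x0.copy()
    for _ in range(iters):
        d = local_corrections(x, blocks)
        g = grad_f(x)
        tau *= inc                      # try a larger step first
        while f(x + tau * d) > f(x) + c * tau * np.dot(g, d):
            tau *= dec                  # shrink until decrease is sufficient
        x = x + tau * d
    return x

blocks = [np.arange(0, 5), np.arange(5, 10)]   # two illustrative "subdomains"
x = schwarz_backtracking(np.ones(10), blocks)
print(f(x))
```

Because the sufficient-decrease test queries only objective values, the loop never inspects how the corrections were computed, which is one way to read the abstract's claim that the strategy is independent of the local solvers.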
