Abstract
An abstract convergence theorem is proved for a class of generalized descent methods that explicitly models relative errors. The theorem generalizes and unifies several recent abstract convergence theorems. It applies to possibly nonsmooth and nonconvex lower semicontinuous functions that satisfy the Kurdyka--Łojasiewicz (KL) inequality, a property shared by a very broad class of problems. Many recent algorithms whose convergence is established via the KL inequality can be cast in the abstract framework of this paper, and therefore the sequences they generate converge to a stationary point of the objective function. Additional flexibility compared to related approaches is gained by a descent property formulated with respect to a function that is allowed to change along the iterations, a generic distance measure, and an explicit/implicit relative error condition involving finite linear combinations of distance terms. As an application of this flexibility, convergence is proved for a block coordinate variable metric version of iPiano (an inertial forward-backward splitting algorithm), which performs favorably on an inpainting problem with a Mumford--Shah-like regularization from image processing.
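To make the two central objects named above concrete, the following is a standard sketch (not taken from this paper; the symbols $f$, $g$, $\varphi$, $\alpha$, $\beta$ are generic and the constants are illustrative). The KL inequality, in its common form, and the basic (single-block, fixed-metric) iPiano iteration read:

```latex
% Kurdyka--Lojasiewicz inequality at a point \bar{x} (standard form):
% there exist \eta > 0, a neighborhood U of \bar{x}, and a concave
% desingularizing function \varphi with \varphi(0) = 0, \varphi' > 0, such that
%   \varphi'\bigl(f(x) - f(\bar{x})\bigr)\,\operatorname{dist}\bigl(0, \partial f(x)\bigr) \ge 1
% for all x \in U with f(\bar{x}) < f(x) < f(\bar{x}) + \eta.
%
% Basic iPiano step for minimizing f + g, with f smooth and g possibly
% nonsmooth (step size \alpha > 0, inertial parameter \beta \in [0,1)):
%   x^{n+1} = \operatorname{prox}_{\alpha g}\!\bigl(
%       x^{n} - \alpha \nabla f(x^{n}) + \beta\,(x^{n} - x^{n-1})
%   \bigr).
\[
  \varphi'\bigl(f(x) - f(\bar{x})\bigr)\,
  \operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; 1,
  \qquad
  x^{n+1} = \operatorname{prox}_{\alpha g}\!\bigl(
    x^{n} - \alpha \nabla f(x^{n}) + \beta\,(x^{n} - x^{n-1})
  \bigr).
\]
```

The block coordinate variable metric version studied in the paper generalizes this template by updating groups of coordinates with iteration-dependent metrics; the abstract convergence theorem then covers the resulting sequence via the KL property.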