Variational models for image deblurring problems typically consist of a smooth term and a potentially non-smooth convex term. A common approach to solving these problems is to use proximal gradient methods. To accelerate the convergence of these first-order iterative algorithms, strategies such as variable metric methods have been introduced in the literature. In this paper, we prove that, for image deblurring problems, the variable metric strategy proposed in Aleotti et al. (Comput. Optim. Appl., 2024) can be reinterpreted as a right preconditioning method. Consequently, we explore an inexact left-preconditioned version of the same proximal gradient method. We prove the convergence of the new iteration to the minimum of a variational model in which the norm of the data fidelity term depends on the preconditioner. The numerical results show that left and right preconditioning are comparable in terms of the number of iterations required to reach a prescribed tolerance, but left preconditioning requires much less CPU time, since it involves fewer evaluations of the preconditioner matrix than right preconditioning. The quality of the computed solutions with left and right preconditioning is comparable. Finally, we propose some non-stationary sequences of preconditioners that allow for fast and stable convergence to the solution of the variational problem with the classical ℓ2 norm on the fidelity term.
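For orientation, below is a minimal, self-contained sketch of a proximal gradient iteration in which a preconditioner is applied to the gradient of the smooth term. The operator A, the ℓ1 regularizer, the fixed step size, and all function names are illustrative assumptions, not the paper's algorithm; a genuine variable-metric (right-preconditioned) method would also compute the proximal operator in the metric induced by the preconditioner.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (a simple stand-in for the non-smooth term)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def preconditioned_prox_grad(A, b, P, step, lam, n_iter=200):
    """Sketch of a proximal gradient iteration for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
    with a fixed symmetric positive definite preconditioner P applied to the gradient:
        x_{k+1} = prox_{step * lam * ||.||_1}(x_k - step * P @ grad_f(x_k)).
    All choices here (ℓ1 regularizer, constant step, Euclidean prox) are illustrative,
    not the left- or right-preconditioned scheme analyzed in the paper.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)              # gradient of the data-fidelity term
        x = soft_threshold(x - step * (P @ grad), step * lam)
    return x

# Tiny illustrative run with a random stand-in for the blur operator.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 30))
x_true = np.zeros(30); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
P = np.eye(30)                                # identity, i.e. no preconditioning
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz constant of the gradient
x_hat = preconditioned_prox_grad(A, b, P, step, lam=0.1)
```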