Abstract

Total variation image denoising models have received considerable attention over the last two decades. Solving constrained total variation image denoising problems requires computing the resolvent of the sum of a maximal monotone operator and a composite operator, where the composite operator is itself the composition of a maximal monotone operator with a bounded linear operator. Building on recent work, in this paper we propose a fixed-point approach for computing this resolvent. Under mild conditions on the iterative parameters, we prove strong convergence of an iterative sequence based on the classical Krasnoselskii–Mann algorithm in general Hilbert spaces. As a direct application, we obtain an effective iterative algorithm for computing the proximity operator of the sum of two convex functions, one of which is the composition of a convex function with a linear transformation. Numerical experiments on image denoising illustrate the efficiency and effectiveness of the proposed algorithm; in particular, we report results for different step sizes and relaxation parameters.
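The Krasnoselskii–Mann scheme underlying the convergence result can be sketched on a toy problem. The nonexpansive map below (a composition of two projections) and all numerical values are illustrative assumptions, not the operators studied in the paper:

```python
import numpy as np

def km(T, x0, lam=0.5, iters=200):
    """Krasnoselskii-Mann iteration: x_{k+1} = (1 - lam) x_k + lam T(x_k)."""
    x = x0
    for _ in range(iters):
        x = (1 - lam) * x + lam * T(x)
    return x

# Toy nonexpansive map: project onto the ball ||x|| <= 2, then onto the line x[0] = 1.
def proj_ball(x, r=2.0):
    n = np.linalg.norm(x)
    return x if n <= r else x * (r / n)

def proj_line(x):
    y = x.copy()
    y[0] = 1.0
    return y

T = lambda x: proj_line(proj_ball(x))
x_star = km(T, np.array([5.0, 5.0]))
# x_star is (approximately) a fixed point of T, i.e. a point in the
# intersection of the ball and the line.
```

For a relaxation parameter lam in (0, 1) and a nonexpansive T with a fixed point, the iterates converge to a fixed point of T; here that point lies in the intersection of the two sets.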

Highlights

  • In the last two decades, the total variation (TV) image denoising model proposed by Rudin, Osher, and Fatemi [1] has received considerable attention

  • Because TV regularization has the advantage of maintaining image edges when removing noise, it has been extended to many other important image processing problems, including image deblurring [2,3,4], image inpainting [5], image superresolution [6], image segmentation [7], and image reconstruction [8, 9]

  • It is worth mentioning that TV includes isotropic total variation (ITV) and anisotropic total variation (ATV), which can both be viewed as compositions of a convex function φ with a linear transformation B
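Both TV variants in the last highlight can be written as φ(Bx), with B a discrete gradient. The sketch below uses forward differences with Neumann boundary conditions, a common but here assumed discretization; only the choice of φ distinguishes the two variants:

```python
import numpy as np

def grad(x):
    """Discrete gradient B: forward differences, zero at the last row/column (Neumann)."""
    gx = np.zeros_like(x)
    gy = np.zeros_like(x)
    gx[:-1, :] = x[1:, :] - x[:-1, :]
    gy[:, :-1] = x[:, 1:] - x[:, :-1]
    return gx, gy

x = np.outer(np.linspace(0.0, 1.0, 8), np.ones(8))  # toy 8x8 vertical ramp image
gx, gy = grad(x)
atv = np.abs(gx).sum() + np.abs(gy).sum()  # ATV: phi = l1 norm of Bx
itv = np.sqrt(gx**2 + gy**2).sum()         # ITV: phi = sum of pixelwise l2 norms
```

Since |a| + |b| ≥ √(a² + b²), the anisotropic value always dominates the isotropic one; for this ramp, whose gradient has a single nonzero component per pixel, the two coincide.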

Introduction

In the last two decades, the total variation (TV) image denoising model proposed by Rudin, Osher, and Fatemi [1] has received considerable attention. Often referred to as the ROF model, it takes the form min_{x∈ℝⁿ} ‖x − u‖² + μ‖x‖_TV (1), where u is the observed noisy image and μ > 0 is a regularization parameter. Many efficient iterative algorithms have been proposed to solve the ROF model (1), including the Chambolle gradient projection algorithm and its variants [10,11,12], the primal–dual hybrid gradient algorithm [13,14,15], and the split Bregman algorithm [16, 17]. Micchelli et al. [20] extended the ROF model (1) to a general convex optimization problem in which the TV term is replaced by the composition of a convex function φ with a linear transformation.
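As a concrete instance of the gradient projection algorithms cited above, here is a minimal sketch of a Chambolle-style dual projection method for the isotropic ROF model. The discretization, the step size τ = 0.125, and the parameter values are assumptions for illustration, not the exact scheme of any cited reference:

```python
import numpy as np

def grad(x):
    """Forward-difference gradient with Neumann boundary."""
    gx = np.zeros_like(x)
    gy = np.zeros_like(x)
    gx[:-1, :] = x[1:, :] - x[:-1, :]
    gy[:, :-1] = x[:, 1:] - x[:, :-1]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    dx = np.zeros_like(px)
    dy = np.zeros_like(py)
    dx[0, :] = px[0, :]
    dx[1:-1, :] = px[1:-1, :] - px[:-2, :]
    dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]
    dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]
    dy[:, -1] = -py[:, -2]
    return dx + dy

def chambolle_tv(u, lam=0.1, tau=0.125, iters=100):
    """Dual projection iteration for min_x ||x - u||^2 / (2 lam) + ||x||_TV."""
    px = np.zeros_like(u)
    py = np.zeros_like(u)
    for _ in range(iters):
        gx, gy = grad(div(px, py) - u / lam)
        norm = 1.0 + tau * np.sqrt(gx**2 + gy**2)
        px = (px + tau * gx) / norm
        py = (py + tau * gy) / norm
    return u - lam * div(px, py)
```

The dual variable (px, py) is kept in the pointwise unit ball by the normalization step; the denoised image is recovered as u minus λ times the divergence of the converged dual variable.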
