Abstract

The forward–backward algorithm is a splitting method for solving convex minimization problems whose objective is the sum of two functions. It has received great attention in optimization due to its broad applicability across many disciplines, such as image and signal processing, optimal control, regression, and classification. In this work, we introduce new forward–backward algorithms for solving both unconstrained and constrained convex minimization problems by using a linesearch technique. We discuss convergence under mild conditions that do not require Lipschitz continuity of the gradient. Finally, we apply the algorithms to compressive sensing and image inpainting problems. Numerical results show that the proposed algorithm is more efficient than some algorithms in the literature. We also discuss the optimal choice of parameters in the algorithms via numerical experiments.

Highlights

  • In a real Hilbert space H, the unconstrained minimization problem of the sum of two convex functions is modeled as min_{x∈H} f(x) + g(x). (1.1)

  • We suggest a projected forward–backward algorithm for solving the constrained convex minimization problem modeled as min f(x) + g(x). (1.5)

  • We suggest new forward–backward algorithms to solve the unconstrained and constrained convex minimization problems, which are based on a new linesearch technique [14].
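The linesearch idea behind these algorithms can be illustrated with a small sketch. The helper below implements one common backtracking rule for forward–backward methods (shrink the step size α until α‖∇f(y) − ∇f(x)‖ ≤ δ‖y − x‖ holds); the function name, parameter defaults, and the specific acceptance test are illustrative assumptions, not necessarily the linesearch of [14].

```python
import numpy as np

def linesearch_step(x, grad_f, prox_g, sigma=1.0, theta=0.5, delta=0.4):
    """One forward-backward step with backtracking linesearch (illustrative).

    Starting from trial step sigma, shrink alpha by factor theta until the
    acceptance test alpha*||grad_f(y) - grad_f(x)|| <= delta*||y - x||
    holds, so no global Lipschitz constant of grad_f is needed.
    """
    alpha = sigma
    while True:
        # Forward (gradient) step followed by backward (proximal) step.
        y = prox_g(x - alpha * grad_f(x), alpha)
        if alpha * np.linalg.norm(grad_f(y) - grad_f(x)) <= delta * np.linalg.norm(y - x):
            return y, alpha
        alpha *= theta
```

With f(x) = ½‖x − b‖² (so ∇f is 1-Lipschitz) and g = ‖·‖₁, the loop terminates as soon as α ≤ δ, since then α‖∇f(y) − ∇f(x)‖ = α‖y − x‖ ≤ δ‖y − x‖.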



Introduction

Suantai et al., Advances in Difference Equations (2021) 2021:265

In a real Hilbert space H, the unconstrained minimization problem of the sum of two convex functions is modeled in the following form:

min_{x∈H} f(x) + g(x), (1.1)

where f, g : H → R ∪ {+∞} are proper lower semicontinuous convex functions. It is well known that (1.1) is equivalent to finding a zero of the subdifferential of f + g at x. When f is differentiable, this is in turn equivalent to the fixed-point equation

x = prox_{αg}(Id − α∇f)(x),

where α > 0, and prox_g is the proximal operator of g defined by prox_g = (Id + ∂g)^{−1}, where Id denotes the identity operator in H, and ∂g is the subdifferential of g. In this connection, we can define a simple splitting method

x_{k+1} = prox_{α_k g}(Id − α_k ∇f)(x_k), k ≥ 0,

in which the gradient evaluation is the forward step and the proximal evaluation is the backward step.
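The splitting iteration above can be sketched concretely. The example below applies forward–backward splitting to the lasso-type instance f(x) = ½‖Ax − b‖² and g(x) = λ‖x‖₁, for which prox_{αg} is the soft-thresholding operator; the function names and the fixed step size α are illustrative assumptions (the paper's algorithms choose α_k by linesearch instead).

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, alpha, iters=100):
    """Iterate x_{k+1} = prox_{alpha*g}((Id - alpha*grad f)(x_k))
    for f(x) = 0.5*||Ax - b||^2 and g(x) = lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                 # forward (explicit gradient) step
        x = soft_threshold(x - alpha * grad, alpha * lam)  # backward (proximal) step
    return x
```

For A = Id the iteration reduces to x = soft_threshold(b, λ), the well-known closed-form solution of min ½‖x − b‖² + λ‖x‖₁.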

