Abstract

This paper studies a class of forward–backward splitting methods based on the Lyapunov distance for variational inequalities and convex minimization problems in Banach spaces.

Highlights

  • Let X be a reflexive, strictly convex and smooth Banach space with dual space X∗, A : X ⇒ X∗ a general maximal monotone operator, and C a closed convex set in X.

  • In [3], convergence results have been obtained for the backward–backward splitting method (2) under a key assumption involving the Fenchel conjugate Ψ∗ and the parameters λnβn.

  • In [18], the authors prove that every sequence generated by a projection iterative method converges strongly to a common minimum norm solution of a variational inequality problem for an inverse strongly monotone mapping in Banach spaces
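As a point of reference for the splitting schemes discussed above, the classical forward–backward (proximal gradient) iteration in a Euclidean setting can be sketched as follows. This is an illustration only, not the paper's Banach-space method: here the backward step is the ordinary proximal map (soft-thresholding for the ℓ1 penalty), whereas the paper replaces it with a Lyapunov-distance analogue; the problem instance and all names (`soft_threshold`, `forward_backward`, the least-squares objective) are hypothetical choices for the sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1: the backward (implicit) half-step.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam=0.01, step=None, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by forward-backward splitting.

    Euclidean sketch only; in the Banach-space setting the proximal step
    is defined via the Lyapunov distance rather than the metric one.
    """
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const. of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                          # forward (explicit) step
        x = soft_threshold(x - step * grad, step * lam)   # backward (proximal) step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
x_hat = forward_backward(A, b)
```

With clean data and a small penalty, the iterates recover the sparse generator of `b` up to the usual ℓ1 shrinkage bias.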


Summary

Introduction

Let X be a reflexive, strictly convex and smooth Banach space with dual space X∗, A : X ⇒ X∗ a general maximal monotone operator, and C a closed convex set in X. In [4], the authors prove that every sequence generated by the forward–backward splitting method converges weakly to a solution of the minimization problem if either the penalization function or the objective function is inf-compact. In [18], the authors prove that every sequence generated by a projection iterative method converges strongly to a common minimum norm solution of a variational inequality problem for an inverse strongly monotone mapping in Banach spaces. It is obvious from the definition of W that (‖x‖ − ‖y‖)² ≤ W(x, y) ≤ (‖x‖ + ‖y‖)², ∀x, y ∈ X.
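In a Hilbert space the duality map J is the identity, so the Lyapunov functional (assuming the standard definition W(x, y) = ‖x‖² − 2⟨x, Jy⟩ + ‖y‖², which the excerpt does not restate) reduces to W(x, y) = ‖x − y‖², and the two-sided bound above follows from the triangle inequality. A quick numeric sanity check of the bound in this special case:

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.standard_normal(4)
    y = rng.standard_normal(4)
    # In Hilbert space J = identity, so W(x, y) = ||x||^2 - 2<x, y> + ||y||^2 = ||x - y||^2.
    W = np.dot(x, x) - 2 * np.dot(x, y) + np.dot(y, y)
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    assert (nx - ny) ** 2 <= W + 1e-12   # lower bound
    assert W <= (nx + ny) ** 2 + 1e-12   # upper bound
```

In a genuine (non-Hilbert) Banach space W(x, y) need not equal ‖x − y‖², which is why only the two-sided estimate is available.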

The proofs proceed by first observing a key estimate, summing it over n, and exploiting the condition on λnβn together with boundedness of the generated sequence; the section culminates in a weak ergodic convergence result for that sequence.
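Weak ergodic convergence refers to (weak) convergence of the Cesàro-type averages of the iterates rather than of the iterates themselves. A minimal sketch of such averaging, with the weights and the oscillating example sequence chosen purely for illustration (the function name `ergodic_averages` is hypothetical):

```python
import numpy as np

def ergodic_averages(iterates, weights=None):
    """Weighted Cesaro averages z_n = (sum_{k<=n} w_k x_k) / (sum_{k<=n} w_k).

    Weak ergodic convergence asserts that z_n, not necessarily x_n itself,
    converges (weakly) to a solution.
    """
    xs = np.asarray(iterates, dtype=float)
    w = np.ones(len(xs)) if weights is None else np.asarray(weights, dtype=float)
    num = np.cumsum(w[:, None] * xs, axis=0)
    den = np.cumsum(w)
    return num / den[:, None]

# An oscillating sequence that does not converge, whose averages do:
xs = [((-1) ** n, 1.0) for n in range(1, 2001)]
z = ergodic_averages(xs)
```

Here the first coordinate of x_n oscillates between ±1 forever, yet the averages z_n tend to (0, 1), which is the phenomenon the ergodic result captures.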