Abstract

Let H be a real Hilbert space and let C be a nonempty closed convex subset of H. Assume that g is a real-valued convex function and that its gradient $\nabla g$ is $\frac{1}{L}$-ism with $L>0$. Let $0<\lambda<\frac{2}{L+2}$ and $0<\beta_{n}<1$. We prove that the sequence $\{x_{n}\}$ generated by the iterative algorithm $x_{n+1}=P_{C}(I-\lambda(\nabla g+\beta_{n}I))x_{n}$, $\forall n\geq 0$, converges strongly to $q\in U$, where $q=P_{U}(0)$ is the minimum-norm solution of the constrained convex minimization problem, which also solves the variational inequality $\langle -q, p-q\rangle\leq 0$, $\forall p\in U$. Under suitable conditions, we obtain some strong convergence theorems. As an application, we apply our algorithm to the split feasibility problem in Hilbert spaces.
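As a rough sketch of how this iteration can be run in practice (not taken from the paper), the Python snippet below applies $x_{n+1}=P_{C}(x_{n}-\lambda(\nabla g(x_{n})+\beta_{n}x_{n}))$ to a toy problem; the quadratic objective, the unit-ball constraint set C, the data A and b, and the schedule $\beta_{n}=1/(n+1)$ are assumptions made only for this illustration.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's example):
# minimize g(x) = 0.5 * ||A x - b||^2 over the closed ball C = {x : ||x|| <= 1}.
# Here grad g(x) = A^T (A x - b) is (1/L)-ism with L = ||A^T A||,
# and the step size satisfies 0 < lam < 2/(L + 2) as required by the theorem.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad_g(x):
    return A.T @ (A @ x - b)

def proj_C(x):                      # projection onto the unit ball
    nrm = np.linalg.norm(x)
    return x if nrm <= 1.0 else x / nrm

L = np.linalg.norm(A.T @ A, 2)      # Lipschitz constant of grad g
lam = 1.0 / (L + 2.0)               # satisfies 0 < lam < 2/(L + 2)

x = np.zeros(5)
for n in range(1, 2000):
    beta_n = 1.0 / (n + 1)          # beta_n -> 0: a typical regularization choice
    x = proj_C(x - lam * (grad_g(x) + beta_n * x))

print("approximate minimum-norm minimizer over C:", x)
```

The vanishing regularization term $\beta_{n}x_{n}$ is what singles out the minimum-norm element of U among all minimizers; without it the plain gradient-projection step would still converge, but not necessarily to $P_{U}(0)$.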

Highlights

  • Let H be a real Hilbert space with inner product ⟨·, ·⟩ and norm ‖·‖

  • We prove that the sequence generated by the iterative algorithm xn+1 = PC(I – λ(∇g + βnI))xn, ∀n ≥ 0, converges strongly to q ∈ U, where q = PU(0) is the minimum-norm solution of the constrained convex minimization problem, which solves the variational inequality ⟨–q, p – q⟩ ≤ 0, ∀p ∈ U

  • A nonlinear operator T : H → H is nonexpansive if ‖Tx – Ty‖ ≤ ‖x – y‖ for all x, y ∈ H


Summary

Introduction

Let H be a real Hilbert space with inner product ⟨·, ·⟩ and norm ‖·‖. Prior work obtained the following iterative algorithm: xn+1 = αnγVxn + (I – μαnF)Txn, ∀n ≥ 0, where V is a Lipschitzian operator. Based on these iterative algorithms, some authors combined the GPA with averaged operators to solve the constrained convex minimization problem [ , ]. Yu et al. [ ] proposed a strong convergence theorem with a regularized-like method to find an element of the set of solutions of a monotone inclusion problem in a Hilbert space. We prove that the sequence generated by the algorithm xn+1 = PC(I – λ(∇g + βnI))xn, ∀n ≥ 0, converges strongly to a point q ∈ U, where q = PU(0) is the minimum-norm solution of the constrained convex minimization problem. We give a concrete example and numerical results to illustrate the fast convergence of our algorithm.
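To illustrate the split feasibility application mentioned in the abstract, the sketch below assumes the standard reduction in which g(x) = ½‖Ax – PQ(Ax)‖², so that ∇g(x) = Aᵀ(Ax – PQ(Ax)) is (1/L)-ism with L = ‖A‖²; the matrix A, the ball C, the box Q, and the schedule βn = 1/(n+1) are hypothetical choices for this example only.

```python
import numpy as np

# Toy split feasibility instance (illustration only): find x in C with Ax in Q,
# where C is the unit ball in R^3 and Q = [1, 2]^2 is a box in R^2.
# Standard reduction: g(x) = 0.5 * ||Ax - P_Q(Ax)||^2, so
# grad g(x) = A^T (Ax - P_Q(Ax)), which is (1/L)-ism with L = ||A||^2.

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

def proj_C(x):                      # projection onto the unit ball in R^3
    nrm = np.linalg.norm(x)
    return x if nrm <= 1.0 else x / nrm

def proj_Q(y):                      # projection onto the box [1, 2]^2
    return np.clip(y, 1.0, 2.0)

def grad_g(x):
    Ax = A @ x
    return A.T @ (Ax - proj_Q(Ax))

L = np.linalg.norm(A, 2) ** 2       # L = ||A||^2
lam = 1.0 / (L + 2.0)               # 0 < lam < 2/(L + 2)

x = np.array([1.0, 1.0, 1.0])
for n in range(1, 5000):
    beta_n = 1.0 / (n + 1)
    x = proj_C(x - lam * (grad_g(x) + beta_n * x))

print("x =", x, " Ax =", A @ x)     # Ax should lie (approximately) in Q
```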

Preliminaries
Main results
Conclusion