Abstract

We suggest and analyze a modified extragradient method for solving variational inequalities, which converges strongly to the minimum-norm solution of a variational inequality in an infinite-dimensional Hilbert space.

Highlights

  • Let C be a closed convex subset of a real Hilbert space H

  • It is well known that the convergence of Algorithm 1.1 requires that the operator A be both strongly monotone and Lipschitz continuous

  • These restrictive conditions rule out its application in several important problems


Summary

Introduction

Let C be a closed convex subset of a real Hilbert space H. It is well known that the variational inequality (find u ∈ C such that ⟨Au, v − u⟩ ≥ 0 for all v ∈ C) is equivalent to the fixed point problem u = P_C[u − ρAu] for any ρ > 0, where P_C denotes the metric projection of H onto C. This alternative formulation has been used to study the existence of solutions of the variational inequality as well as to develop several numerical methods. It is well known that the convergence of Algorithm 1.1 requires that the operator A be both strongly monotone and Lipschitz continuous. These restrictive conditions rule out its application in several important problems. We suggest and consider a very simple modified extragradient method which converges strongly to the minimum-norm solution of variational inequality 1.2 in an infinite-dimensional Hilbert space. This new method includes the method of Noor [2] as a special case.
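The fixed-point formulation above underlies the classical extragradient method of Korpelevich, which the paper's modified method builds on. A minimal numerical sketch of that classical method is given below; the set C (a box), the operator A, and all function names are illustrative assumptions, not the paper's construction, and this finite-dimensional example only shows the two-projection iteration, not the strong convergence result of the paper.

```python
import numpy as np

# Sketch of the classical extragradient method for the variational
# inequality: find u in C with <A(u), v - u> >= 0 for all v in C.
# The paper's *modified* method is not reproduced here; the box C and
# the operator A below are illustrative assumptions.

def project_box(x, lo, hi):
    """Metric projection P_C onto the box C = [lo, hi]^n (closed convex)."""
    return np.clip(x, lo, hi)

def extragradient(A, project, x0, lam=0.1, tol=1e-10, max_iter=10_000):
    """Two projections per iteration: a predictor step and a corrector step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = project(x - lam * A(x))        # predictor: y_k = P_C[x_k - lam*A(x_k)]
        x_new = project(x - lam * A(y))    # corrector: x_{k+1} = P_C[x_k - lam*A(y_k)]
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: A(x) = x - b is strongly monotone and Lipschitz continuous,
# so the variational inequality on C = [0, 1]^n has the unique
# solution P_C(b), i.e. b clipped to the box.
b = np.array([1.5, -0.3, 0.4])
A = lambda x: x - b
sol = extragradient(A, lambda z: project_box(z, 0.0, 1.0), np.zeros(3))
```

In this strongly monotone example even the simple projection iteration (Algorithm 1.1) would converge; the extragradient refinement matters precisely when strong monotonicity fails, which is the restriction the paper aims to relax.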

Preliminaries
Main Result
