Abstract

It is well known that the gradient-projection algorithm (GPA) is very useful in solving constrained convex minimization problems. In this paper, we combine a general iterative method with the gradient-projection algorithm to propose a hybrid gradient-projection algorithm and prove that the sequence generated by the hybrid gradient-projection algorithm converges in norm to a minimizer of constrained convex minimization problems which solves a variational inequality.

Highlights

  • Let H be a real Hilbert space and C a nonempty closed and convex subset of H

  • Recall that a contraction on C is a self-mapping h of C such that ‖h(x) − h(y)‖ ≤ ρ‖x − y‖ for all x, y ∈ C, where ρ ∈ [0, 1) is a constant

  • We prove that ‖xn+1 − xn‖ → 0 as n → ∞


Summary

Introduction

Let H be a real Hilbert space and C a nonempty closed and convex subset of H. Xu [11] proved that, under appropriate conditions on {αn} and {λn}, the sequence {xn} defined by the following relaxed gradient-projection algorithm

xn+1 = (1 − αn)xn + αn ProjC(xn − λn∇f(xn)), n ≥ 0, (1.5)

converges weakly to a minimizer of (1.1). It is proved that if the sequences {θn} and {λn} satisfy appropriate conditions, the sequence {xn} generated by (1.6) converges in norm to a minimizer of (1.1) that solves the variational inequality

x∗ ∈ S, ⟨(I − h)x∗, x − x∗⟩ ≥ 0, for all x ∈ S.
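To make iteration (1.5) concrete, the following is a minimal numerical sketch on a toy instance, not the paper's construction: f(x) = ½‖Ax − b‖² minimized over the box C = [0, 1]^n (a hypothetical choice, made because its metric projection is a simple componentwise clip), with a constant relaxation parameter αn and a constant step size λn below 2/L, where L is the Lipschitz constant of ∇f.

```python
import numpy as np

# Hypothetical toy instance: minimize f(x) = 0.5 * ||A x - b||^2
# over the box C = [0, 1]^n, whose projection is a componentwise clip.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))
b = rng.standard_normal(8)

def grad_f(x):
    # Gradient of the least-squares objective: A^T (A x - b).
    return A.T @ (A @ x - b)

def proj_C(x):
    # Metric projection onto the box [0, 1]^n.
    return np.clip(x, 0.0, 1.0)

# Lipschitz constant of grad f is the spectral norm of A^T A;
# fixed step sizes lambda_n must lie in (0, 2/L) for convergence.
L = np.linalg.norm(A.T @ A, 2)

x = np.zeros(4)
alpha = 0.5      # relaxation parameter alpha_n in (0, 1)
lam = 1.0 / L    # step size lambda_n
for n in range(2000):
    # Relaxed gradient-projection step (1.5):
    # x_{n+1} = (1 - alpha_n) x_n + alpha_n Proj_C(x_n - lambda_n grad f(x_n))
    x = (1 - alpha) * x + alpha * proj_C(x - lam * grad_f(x))

# At a minimizer, x is a fixed point of the projected-gradient map,
# so the residual ||x - Proj_C(x - lam * grad_f(x))|| should be tiny.
residual = np.linalg.norm(x - proj_C(x - lam * grad_f(x)))
```

With constant parameters this sketch only illustrates the weak-convergence setting of (1.5); the paper's hybrid scheme (1.6) additionally mixes in a contraction h via the sequence {θn} to obtain convergence in norm.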

