Abstract
The regularization method plays an important role in solving constrained convex minimization problems. Based on the idea of regularization, we propose implicit and explicit iterative algorithms in this paper; the sequences generated by these algorithms converge strongly to a solution of the constrained convex minimization problem, which in turn solves a certain variational inequality. As an application, we apply the algorithms to solve the split feasibility problem.
Highlights
Assume that H is a Hilbert space with inner product ⟨⋅, ⋅⟩ and norm ‖ ⋅ ‖ induced by that inner product
Let C be a closed and convex subset of a Hilbert space H and let T : C → C be a nonexpansive mapping with Fix(T) ≠ ∅
Summary
Assume that H is a Hilbert space with inner product ⟨⋅, ⋅⟩ and norm ‖ ⋅ ‖ induced by that inner product. The gradient-projection algorithm can be used to solve the constrained convex minimization problem. A counterexample was constructed to prove that algorithm (6) has only weak convergence in infinite-dimensional spaces, and two modifications were provided to ensure that the gradient-projection algorithm converges strongly to a solution of (5). More investigations of the gradient-projection algorithm and its important role in solving the constrained convex minimization problem can be found in [2–11]. For the gradient-projection method based on regularization, a weak convergence result is known for the sequence defined by x_{n+1} := Proj_C (I − γ∇f_{α_n})(x_n), n ≥ 0. The sequence generated by the constructed algorithm converges strongly to a minimizer of the constrained convex minimization problem. We apply the constructed algorithm to solve a split feasibility problem.
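The regularized gradient-projection iteration above can be illustrated with a minimal numerical sketch. Everything concrete here (the objective f(x) = ½‖Ax − b‖², the box constraint C = [0, 1]^n, the step size γ, and the choice α_n = 1/n) is an illustrative assumption, not taken from the paper; it only shows the shape of the update x_{n+1} = Proj_C(x_n − γ(∇f(x_n) + α_n x_n)), where ∇f_{α_n} = ∇f + α_n I is the Tikhonov-regularized gradient.

```python
import numpy as np

# Hypothetical example: minimize f(x) = 0.5*||Ax - b||^2 over the box
# C = [0, 1]^n via the regularized gradient-projection iteration
#   x_{n+1} = Proj_C( x_n - gamma * (grad_f(x_n) + alpha_n * x_n) ),
# with alpha_n -> 0 supplying the vanishing Tikhonov regularization.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))
b = rng.standard_normal(8)

def grad_f(x):
    return A.T @ (A @ x - b)       # gradient of 0.5*||Ax - b||^2

def proj_C(x):
    return np.clip(x, 0.0, 1.0)   # metric projection onto the box [0, 1]^n

L = np.linalg.norm(A.T @ A, 2)     # Lipschitz constant of grad_f
gamma = 1.0 / L                    # step size chosen in (0, 2/L)
x = np.zeros(4)
for n in range(1, 2001):
    alpha_n = 1.0 / n              # regularization parameter, alpha_n -> 0
    x = proj_C(x - gamma * (grad_f(x) + alpha_n * x))
```

At termination the iterate x is an approximate fixed point of the (unregularized) projection map x ↦ Proj_C(x − γ∇f(x)), which characterizes minimizers of f over C.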