Abstract
The projected reflected gradient method is a simple and elegant method for solving variational inequalities. It requires only one projection onto the feasible set and one evaluation of the cost operator per iteration, and has been shown numerically to be more efficient than most available methods for solving variational inequalities. Convergence results for methods that share this elegant structure remain rare. In this paper, we establish weak and linear convergence of a projected reflected gradient method with an inertial extrapolation step and present applications arising from optimal control problems. We first obtain a weak convergence result for the projected reflected gradient method with an inertial extrapolation step and self-adaptive step sizes under standard assumptions. We then establish a linear convergence rate when the cost operator is strongly monotone and Lipschitz continuous. Finally, we present numerical applications arising from optimal control. Preliminary results show that our method is effective and efficient compared to related state-of-the-art methods in the literature, and demonstrate the advantage gained by incorporating inertial terms into the projected reflected gradient method.
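The structure described in the abstract (one projection and one operator evaluation per iteration, plus an inertial extrapolation) can be sketched as follows. This is a minimal illustration, not the paper's exact scheme: the step size `lam` and inertial parameter `theta` are fixed illustrative constants rather than the self-adaptive rule studied in the paper, and the test problem (a strongly monotone affine operator over the nonnegative orthant) is an assumed example.

```python
import numpy as np

def inertial_prgm(F, proj, x0, lam=0.3, theta=0.1, iters=5000, tol=1e-12):
    """Sketch of a projected reflected gradient iteration with an inertial
    extrapolation step: one projection and one evaluation of F per iteration.
    lam and theta are illustrative constants, not the paper's adaptive rule."""
    x_prev = np.asarray(x0, dtype=float)
    x = proj(x_prev - lam * F(x_prev))        # warm-up projected-gradient step
    for _ in range(iters):
        w = x + theta * (x - x_prev)          # inertial extrapolation
        y = 2.0 * x - x_prev                  # reflected point
        x_next = proj(w - lam * F(y))         # single projection per iteration
        x_prev, x = x, x_next
        if np.linalg.norm(x - x_prev) < tol:
            break
    return x

# Assumed test problem: F(x) = x - 1 is strongly monotone and 1-Lipschitz,
# and C is the nonnegative orthant, so the VI solution is x* = (1, ..., 1).
F = lambda x: x - 1.0
proj = lambda x: np.maximum(x, 0.0)
x_star = inertial_prgm(F, proj, np.zeros(5))
```

On this strongly monotone example the iterates contract linearly toward the solution, mirroring the linear convergence rate the paper establishes under strong monotonicity and Lipschitz continuity.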