Abstract

Linearly constrained convex optimization problems can be reformulated as monotone variational inequalities, a broader class that contains them as special cases. It is often more convenient to derive and analyze optimization methods in the variational inequality framework. Based on twin directions and an identical step size, there exists a unified predictor-corrector framework for projection-contraction methods that solve monotone variational inequalities. According to which of the twin directions is used, the unified framework splits into two classes with similar per-iteration computational cost; overall, the second class outperforms the first, as confirmed by numerical experiments. Most recently developed splitting contraction methods belong to a more general framework in which similar twin directions can be generated. However, until now, all splitting contraction methods could be viewed as generalizations of the methods in the first class of the unified predictor-corrector framework. In this paper, a generalization of the methods in the second class is derived using the existing step size and the other generalized direction. The $O(1/t)$ iteration complexity is established for the generalized unified framework.

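To fix ideas, the following is a minimal LaTeX sketch of one standard projection-contraction predictor-corrector scheme for a monotone variational inequality $\mathrm{VI}(\Omega, F)$. The notation is common in this literature but is assumed here rather than taken from the paper: $P_{\Omega}$ denotes the projection onto $\Omega$, $\tilde u^k$ the predictor, $\beta_k$ the prediction parameter, and $\gamma \in (0,2)$ a relaxation factor. The two correction rules use the twin directions $d(u^k,\tilde u^k)$ and $\beta_k F(\tilde u^k)$ with the same step size $\alpha_k$; whether these coincide exactly with the two classes discussed in the paper is an assumption.

% Sketch of a standard predictor-corrector scheme (assumed notation; requires amsmath).
\begin{align*}
  &\text{Prediction:}     && \tilde u^k = P_{\Omega}\!\bigl[u^k - \beta_k F(u^k)\bigr],
      \qquad \beta_k \bigl\|F(u^k) - F(\tilde u^k)\bigr\| \le \nu \bigl\|u^k - \tilde u^k\bigr\|,\ \nu\in(0,1),\\
  &\text{Direction:}      && d(u^k,\tilde u^k) = (u^k - \tilde u^k) - \beta_k \bigl(F(u^k) - F(\tilde u^k)\bigr),\\
  &\text{Step size:}      && \alpha_k = \gamma\,\frac{(u^k - \tilde u^k)^{\top} d(u^k,\tilde u^k)}{\bigl\|d(u^k,\tilde u^k)\bigr\|^{2}},\\
  &\text{Correction (class I):}  && u^{k+1} = u^k - \alpha_k\, d(u^k,\tilde u^k),\\
  &\text{Correction (class II):} && u^{k+1} = P_{\Omega}\!\bigl[u^k - \alpha_k \beta_k F(\tilde u^k)\bigr].
\end{align*}

In this sketch, both corrections reuse the same evaluations of $F$ and the same step size $\alpha_k$, so their per-iteration costs are comparable; the second correction requires one additional projection onto $\Omega$.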