Abstract

The hybrid steepest descent method is an algorithmic solution to certain hierarchical convex optimization problems, a class of two-stage optimization problems: the first-stage problem is a convex optimization problem; the second-stage problem is the minimization of a differentiable convex function over the solution set of the first-stage problem. To apply this method, the solution set of the first-stage problem must be expressed as the fixed point set of a certain nonexpansive operator. In this paper, we propose a nonexpansive operator that yields a computationally efficient update when plugged into the hybrid steepest descent method. The proposed operator can characterize the solution sets of recent sophisticated convex optimization problems in which multiple proximable convex functions involving linear operators must be minimized. To the best of our knowledge, no previously reported nonexpansive operator for such problems yields an update free from inversions of linear operators when used in the hybrid steepest descent method. Unlike conventional operators, the proposed operator yields an inversion-free update.
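
To make the two-stage structure concrete, the following minimal Python sketch illustrates the generic hybrid steepest descent iteration x_{k+1} = T(x_k) - lam_k * grad f(T(x_k)), not the operator proposed in this paper. The toy choices below (a box projection as the nonexpansive operator T, a quadratic as the second-stage objective f, and the step-size rule lam_k = 1/k) are illustrative assumptions only.

```python
# Minimal sketch of the hybrid steepest descent iteration (illustrative only):
#   x_{k+1} = T(x_k) - lam_k * grad_f(T(x_k)),
# where T is a nonexpansive operator whose fixed point set equals the
# first-stage solution set, and f is the differentiable second-stage objective.
import numpy as np

def hybrid_steepest_descent(T, grad_f, x0, n_iter=5000):
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        lam = 1.0 / k              # diminishing steps: lam_k -> 0, sum lam_k = inf
        y = T(x)                   # move toward Fix(T), the first-stage solution set
        x = y - lam * grad_f(y)    # descent step on the second-stage objective
    return x

# Toy instance (assumed for illustration): first stage = membership in the box
# [0, 1]^2, so T is the box projection and Fix(T) is the box itself;
# second stage = minimize ||x - c||^2 / 2 over that box.
c = np.array([2.0, -0.5])
T = lambda x: np.clip(x, 0.0, 1.0)
grad_f = lambda x: x - c
print(hybrid_steepest_descent(T, grad_f, np.zeros(2)))  # approx. [1.0, 0.0]
```

In this toy instance T is a simple projection; the contribution of the paper concerns constructing a nonexpansive operator T for problems with multiple proximable functions composed with linear operators, such that the resulting update avoids inverting those linear operators.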
