Abstract

In this paper, we generalize primal-dual interior-point methods for linear optimization to convex quadratic semidefinite optimization, a broad class of optimization problems that contains linear optimization, convex quadratic optimization, second-order cone optimization, and semidefinite optimization as special cases. Based on the Nesterov–Todd scaling scheme, we establish the currently best known iteration complexity bounds for large- and small-update interior-point methods for convex quadratic semidefinite optimization, namely O(√n log n log(n/ε)) and O(√n log(n/ε)), respectively, which match the best bounds known for linear optimization.
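For concreteness, the problem class can be sketched in the standard primal form of convex quadratic semidefinite optimization; this formulation and its notation are the usual ones from the literature and are assumptions here, not taken verbatim from the paper:

```latex
% Standard primal CQSDO problem: minimize a convex quadratic objective
% over the intersection of an affine subspace and the cone of
% symmetric positive semidefinite matrices.
\[
\min_{X \in \mathbb{S}^n} \;\; \langle C, X \rangle
  + \tfrac{1}{2}\,\langle X, \mathcal{Q}(X) \rangle
\quad \text{s.t.} \quad
\langle A_i, X \rangle = b_i,\; i = 1,\dots,m, \qquad X \succeq 0,
\]
% where Q : S^n -> S^n is a self-adjoint positive semidefinite
% linear operator, so the objective is convex.
```

Taking the quadratic operator 𝒬 = 0 recovers semidefinite optimization, while restricting X to be a diagonal matrix recovers convex quadratic optimization (and, with 𝒬 = 0, linear optimization), which is how the class subsumes the special cases listed above.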
