Abstract

Online convex optimization (OCO) with switching costs is a key enabler for cloud resource provisioning, online portfolio optimization, and many other applications. Surprisingly, little is known about it theoretically. In this study, we investigate OCO with a squared l2-norm switching cost (OCOl2SC) for three kinds of loss functions: (a) generally convex, (b) convex and smooth, and (c) strongly convex and smooth. We design customized gradient descent algorithms for OCOl2SC in these three cases: SOGD (smoothed online gradient descent) for generally convex loss functions, OOMD (online optimistic mirror descent) for convex and smooth loss functions, and OMGD (online multiple gradient descent) for strongly convex and smooth loss functions. We derive upper bounds on the dynamic regret of each algorithm and show that they match the corresponding lower bounds, concluding that each algorithm achieves an order-optimal or near-optimal dynamic regret bound in its case. Numerical studies further verify the strong performance of the three algorithms.
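To make the problem setup concrete, here is a minimal sketch of one OCO round with a squared l2-norm switching cost. The abstract does not spell out the SOGD, OOMD, or OMGD updates, so the "smoothed" gradient step below, along with the step size eta, the switching-cost weight lam, and the quadratic per-round losses, are all illustrative assumptions rather than the paper's actual algorithms.

```python
import numpy as np

# Sketch of OCO with a squared l2-norm switching cost: at each round
# the learner plays x_t, pays the loss f_t(x_t) plus a movement
# penalty lam * ||x_t - x_{t-1}||_2^2. The update is a plain smoothed
# gradient step (an assumption, not the paper's SOGD): the switching
# cost damps how far the decision moves between rounds.

rng = np.random.default_rng(0)
d, T = 5, 200          # dimension and horizon (arbitrary choices)
eta, lam = 0.5, 1.0    # step size and switching-cost weight (assumed)

x = np.zeros(d)        # initial decision x_0
total_cost = 0.0

for t in range(T):
    target = rng.normal(size=d)              # drifting optimum per round
    loss = 0.5 * np.sum((x - target) ** 2)   # f_t(x_t), revealed after playing x_t
    total_cost += loss

    grad = x - target                        # gradient of f_t at x_t
    # Smoothed step: argmin_y  eta*<grad, y> + (1/2 + lam)*||y - x||^2,
    # whose closed form is below; a larger lam yields smaller moves.
    x_next = x - eta * grad / (1.0 + 2.0 * lam)

    total_cost += lam * np.sum((x_next - x) ** 2)  # switching cost paid
    x = x_next

print(f"average per-round cost: {total_cost / T:.3f}")
```

The closed-form step illustrates the core tension the abstract studies: the switching cost shrinks the effective step size, trading slower adaptation to the drifting losses for cheaper movement.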
