Abstract

We present an information-theoretic approach to lower bound the oracle complexity of nonsmooth black-box convex optimization, unifying previous lower-bounding techniques by identifying a combinatorial problem, namely string guessing, as a single source of hardness. As a measure of complexity, we use distributional oracle complexity, which subsumes randomized oracle complexity as well as worst-case oracle complexity. We obtain strong lower bounds on distributional oracle complexity for the box [-1, 1]^n, as well as for the ℓ_p-ball for p ≥ 1 (in both the low-scale and large-scale regimes), matching worst-case upper bounds; hence we close the gap between distributional complexity (and, in particular, randomized complexity) and worst-case complexity. Furthermore, the bounds remain essentially the same for high-probability and bounded-error oracle complexity, and even for the combination of the two, i.e., bounded-error high-probability oracle complexity. This considerably extends the applicability of known bounds.
