Abstract

We lower bound the complexity of finding \(\epsilon\)-stationary points (with gradient norm at most \(\epsilon\)) using stochastic first-order methods. In a well-studied model where algorithms access smooth, potentially non-convex functions through queries to an unbiased stochastic gradient oracle with bounded variance, we prove that (in the worst case) any algorithm requires at least \(\epsilon^{-4}\) queries to find an \(\epsilon\)-stationary point. The lower bound is tight, and establishes that stochastic gradient descent is minimax optimal in this model. In a more restrictive model where the noisy gradient estimates satisfy a mean-squared smoothness property, we prove a lower bound of \(\epsilon^{-3}\) queries, establishing the optimality of recently proposed variance reduction techniques.
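
Below is a minimal, hypothetical sketch (not the paper's construction) of the setting the abstract describes: stochastic gradient descent querying an unbiased gradient oracle with variance bounded by \(\sigma^2\), run for on the order of \(\epsilon^{-4}\) queries in line with the classical upper bound that this lower bound shows is unimprovable. The test function, the smoothness constant `L`, and the step-size choice are illustrative assumptions.

```python
# A minimal sketch, assuming a toy smooth non-convex objective:
# SGD with an unbiased, bounded-variance stochastic gradient oracle,
# run for on the order of eps**-4 oracle queries.
import numpy as np

rng = np.random.default_rng(0)

def grad(x):
    # True gradient of the (hypothetical) smooth non-convex test
    # function f(x) = sin(x) + 0.1 * x**2.
    return np.cos(x) + 0.2 * x

def oracle(x, sigma=1.0):
    # Unbiased estimate of grad(x) with variance at most sigma**2,
    # matching the oracle model described in the abstract.
    return grad(x) + sigma * rng.standard_normal(x.shape)

def sgd(x0, eps, sigma=1.0, L=1.2):
    # Classical analysis: with T ~ sigma**2 / eps**4 queries and a
    # suitably small step size, some iterate is eps-stationary in
    # expectation.  Constants here are illustrative, not the paper's.
    T = int(np.ceil(sigma**2 / eps**4))
    eta = min(1.0 / L, eps**2 / (L * sigma**2))
    x = x0.astype(float).copy()
    best = x.copy()
    for _ in range(T):
        x = x - eta * oracle(x, sigma)
        # The guarantee holds for a uniformly random iterate; the true
        # gradient is used here only to pick a good iterate for display.
        if np.linalg.norm(grad(x)) < np.linalg.norm(grad(best)):
            best = x.copy()
    return best

x_out = sgd(np.array([3.0]), eps=0.3)
print(np.linalg.norm(grad(x_out)))  # expect a value at most about 0.3
```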
