Abstract

In nonsmooth stochastic optimization, we establish the nonconvergence of stochastic subgradient descent (SGD) to the class of critical points recently called active strict saddles by Davis and Drusvyatskiy. Such a point lies on a manifold M along which the function f has a direction of second-order negative curvature, while off this manifold the norm of the Clarke subdifferential of f is lower-bounded. We require two conditions on f. The first is a Verdier stratification condition, a refinement of the popular Whitney stratification; it allows us to establish a strengthened version of the projection formula of Bolte et al. for Whitney stratifiable functions, which is of independent interest. The second, termed the angle condition, allows us to control the distance of the iterates to M. When f is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer. Funding: The work of Sholom Schechtman was supported by “Région Ile-de-France”.
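To make the setting concrete, below is a minimal numerical sketch, not taken from the paper, using the standard toy example f(x, y) = |x| − y², whose origin is an active strict saddle: the active manifold is M = {x = 0}, f restricted to M has second-order negative curvature at the origin, and off M the Clarke subgradient (sign(x), −2y) has norm at least 1. The step sizes, noise scale, and iteration count are illustrative choices, not values prescribed by the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def clarke_subgrad(z):
    # One measurable selection from the Clarke subdifferential of
    # f(x, y) = |x| - y**2; at x = 0, np.sign returns 0, which lies in [-1, 1].
    x, y = z
    return np.array([np.sign(x), -2.0 * y])

z = np.zeros(2)  # start exactly at the active strict saddle (0, 0)
for k in range(1, 301):
    gamma = 0.1 / k**0.7                 # sum gamma = inf, sum gamma**2 < inf
    eta = rng.normal(scale=0.1, size=2)  # zero-mean oracle noise
    z = z - gamma * (clarke_subgrad(z) + eta)

# x is pulled toward the active manifold M = {x = 0} by the sharp |x| term,
# while the noise pushes |y| away from 0 along the negatively curved direction.
print(z)
```

In this picture, the sharpness of f off M (the lower-bounded subgradient norm) traps the x-coordinate near the manifold, while the second-order negative curvature along M makes the saddle repel the iterates; the paper's result is that, under the Verdier and angle conditions, this escape happens with probability one.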
