Abstract

We prove convergence rates of zeroth-order gradient descent (ZGD) algorithms for Łojasiewicz functions. Our results show that for smooth Łojasiewicz functions with Łojasiewicz exponent in (0.5, 1), the function values can converge much faster than the (zeroth-order) gradient descent trajectory. Similar results hold for convex nonsmooth Łojasiewicz functions.

History: Accepted by Antonio Frangioni, Area Editor for Design & Analysis of Algorithms–Continuous.

Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2023.0247 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2023.0247 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
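The paper's exact ZGD variant is not reproduced here; as a minimal sketch, assume a zeroth-order method that estimates the gradient by central finite differences along coordinate directions and takes a plain descent step. The objective f(x) = ||x||^4 is a hypothetical illustration: under the convention |f(x) - f*|^θ ≤ C||∇f(x)||, it has Łojasiewicz exponent θ = 3/4, which lies in the (0.5, 1) regime the abstract refers to. The step size and iteration count are likewise illustrative choices, not the paper's.

```python
import numpy as np

def zgd(f, x0, mu=1e-6, eta=0.05, iters=500):
    """Zeroth-order gradient descent (sketch, not the paper's algorithm).

    The gradient is estimated from function values only, via central
    finite differences along each coordinate direction:
        g_i ~= (f(x + mu*e_i) - f(x - mu*e_i)) / (2*mu).
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        g = np.array([(f(x + mu * e) - f(x - mu * e)) / (2 * mu)
                      for e in np.eye(x.size)])
        x -= eta * g  # plain descent step on the zeroth-order estimate
    return x

# Illustrative Lojasiewicz function: f(x) = ||x||^4, exponent theta = 3/4.
f = lambda x: float(np.sum(x**2) ** 2)
x = zgd(f, x0=np.ones(2))
```

Here the function value f(x) decays markedly faster than the iterate norm ||x|| (roughly its square for this objective), which is the flavor of gap the abstract describes between function-value and trajectory convergence.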
