Abstract

We develop the theory of discrete-time gradient flows for convex functions on Alexandrov spaces with arbitrary upper or lower curvature bounds. We employ different resolvent maps in the upper and lower curvature bound cases to construct such a flow, and show its convergence to a minimizer of the potential function. We also prove a stochastic version: a generalized law of large numbers for convex-function-valued random variables, which not only extends Sturm's law of large numbers on nonpositively curved spaces to arbitrary lower or upper curvature bounds, but also seems new even in the Euclidean setting. These results generalize those in nonpositively curved spaces (partly for squared distance functions) due to Bacak, Jost, Sturm and others, and the lower curvature bound case seems entirely new.
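For orientation, the standard resolvent (Moreau–Yosida proximal map) used to build discrete-time gradient flows on a metric space $(X,d)$ is sketched below; this is the classical form, and the paper's resolvents in the upper and lower curvature bound cases may be modifications of it:
\[
J_\tau f(x) \;:=\; \operatorname*{arg\,min}_{y \in X} \Big\{\, f(y) + \frac{1}{2\tau}\, d(x,y)^2 \,\Big\}, \qquad \tau > 0,
\]
with the associated discrete-time flow given by the proximal point iteration $x_{k+1} = J_{\tau_k} f(x_k)$ for a sequence of step sizes $\tau_k > 0$.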
