Abstract
We develop the theory of discrete-time gradient flows for convex functions on Alexandrov spaces with arbitrary upper or lower curvature bounds. We employ different resolvent maps in the upper and lower curvature bound cases to construct such a flow, and show its convergence to a minimizer of the potential function. We also prove a stochastic version, a generalized law of large numbers for convex-function-valued random variables, which not only extends Sturm’s law of large numbers on nonpositively curved spaces to arbitrary lower or upper curvature bounds, but also seems new even in the Euclidean setting. These results generalize those in nonpositively curved spaces (partly for squared distance functions) due to Bačák, Jost, Sturm and others, and the lower curvature bound case seems entirely new.
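To fix ideas, the discrete-time flow described in the abstract is built by iterating resolvent maps of the potential function. The sketch below illustrates this only in the flat Euclidean setting, via the classical proximal point iteration; the helper names (`resolvent`, `discrete_gradient_flow`), the toy potential, and the step sizes are assumptions made for illustration, and the resolvent maps the paper employs under upper or lower curvature bounds are different constructions.

```python
# Minimal Euclidean sketch of a discrete-time gradient flow obtained by
# iterating a resolvent (proximal) map,
#     J_lam(x) = argmin_y { f(y) + |x - y|^2 / (2 * lam) }.
# NOTE: everything here (function names, the test potential, the step sizes)
# is an illustrative assumption; the paper's resolvent maps on Alexandrov
# spaces with curvature bounds are not this Euclidean prototype.

import numpy as np
from scipy.optimize import minimize


def resolvent(f, x, lam):
    """Proximal/resolvent map: minimize f(y) + |x - y|^2 / (2 * lam) over y."""
    objective = lambda y: f(y) + np.sum((y - x) ** 2) / (2.0 * lam)
    return minimize(objective, x).x


def discrete_gradient_flow(f, x0, step_sizes):
    """Iterate x_{k+1} = J_{lam_k}(x_k); for convex f on R^n this proximal
    point iteration approaches a minimizer as the step sizes accumulate."""
    x = np.asarray(x0, dtype=float)
    for lam in step_sizes:
        x = resolvent(f, x, lam)
    return x


# Toy potential: f(y) = |y - a|^2, whose unique minimizer is a.
a = np.array([1.0, -2.0])
f = lambda y: np.sum((y - a) ** 2)
print(discrete_gradient_flow(f, x0=np.zeros(2), step_sizes=[0.5] * 30))
```

In the curved settings of the paper, the squared-distance penalty above is replaced by resolvents adapted to the curvature bound, which is what allows the convergence results to extend beyond the nonpositively curved case.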