Abstract

Inspired by the recent paper (L. Ying, Journal of Scientific Computing, 84, 1–14 (2020)), we explore the relationship between mirror descent and the variable metric method. When the metric in mirror descent is induced by a convex function whose Hessian is close to the Hessian of the objective function, the method enjoys both the robustness of mirror descent and the superlinear convergence of Newton-type methods. When applied to a linearly constrained minimization problem, we prove global and local convergence, both in the continuous and discrete settings. As applications, we compute Wasserstein gradient flows and the Cahn–Hilliard equation with degenerate mobility. When these problems are formulated as a minimizing movement scheme with respect to a variable metric, our mirror descent algorithm offers fast convergence for the underlying optimization problem while preserving the total mass and bounds of the solution.
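As a minimal illustration of the mass- and bound-preserving behavior described above (a sketch only, not the authors' algorithm), the following Python snippet implements entropic mirror descent on the probability simplex: with the entropy mirror map φ(x) = Σᵢ xᵢ log xᵢ, the update is multiplicative, so iterates stay positive and renormalization conserves total mass. The objective, step size, and dimension are assumptions made for the example.

```python
import numpy as np

def entropic_mirror_descent(grad_f, x0, step=0.1, iters=200):
    """Mirror descent with the entropy mirror map phi(x) = sum x_i log x_i.

    The multiplicative update keeps every iterate strictly positive, and
    renormalization conserves the total mass, mirroring the structural
    guarantees highlighted in the abstract. (Illustrative sketch only.)
    """
    x = np.asarray(x0, dtype=float)
    mass = x.sum()  # total mass to conserve
    for _ in range(iters):
        # Mirror step: nabla phi(x_new) = nabla phi(x) - step * grad_f(x);
        # for the entropy map this is an exponentiated-gradient update.
        x = x * np.exp(-step * grad_f(x))
        # Bregman projection back onto {x >= 0, sum(x) = mass}.
        x = mass * x / x.sum()
    return x

# Hypothetical usage: minimize f(x) = 0.5 * ||x - target||^2 over the simplex.
if __name__ == "__main__":
    target = np.array([0.5, 0.3, 0.2])
    grad_f = lambda x: x - target
    x0 = np.ones(3) / 3
    print(entropic_mirror_descent(grad_f, x0))
```

Note that positivity and mass conservation here come from the choice of mirror map rather than from an explicit constraint, which is the structural advantage the abstract attributes to the mirror descent formulation.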
