Abstract
Optimal Transport (OT) studies how to transform one probability measure into another at minimal cost and has found wide application in machine learning. However, the heavy computational burden of the primal OT distance makes it prohibitive for the prevalent high-dimensional problems. Recent work adds an entropic regularization term to the primal OT objective, yielding a strictly convex problem that can be addressed with stochastic optimization. We focus on optimization methods for regularized discrete and semi-discrete OT. In place of the original SAG method for discrete OT, we apply SAGA, which removes the bias of SAG's gradient estimates, and SVRG, which removes SAG's memory overhead of storing per-sample gradients. In addition, we define a notion of regret for the semi-discrete OT problem and solve it from the perspective of online learning. To the best of our knowledge, our study is the first to solve the semi-discrete OT problem with the follow-the-regularized-leader (FTRL) framework. Our FTRL-\textit{current} algorithm converges faster than existing algorithms for the semi-discrete OT problem.
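To make the entropic-regularization idea concrete, the following is a minimal sketch (not the paper's algorithm) of plain averaged SGD on the entropic semi-dual of a discrete OT problem, in the style the abstract alludes to; the toy measures, cost, step size, and iteration count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete measures: n source points, m target points (illustrative setup)
n, m, eps = 50, 40, 0.1
x = rng.normal(size=(n, 2)); a = np.full(n, 1.0 / n)
y = rng.normal(size=(m, 2)); b = np.full(m, 1.0 / m)
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # squared-Euclidean cost

def semidual_grad(i, v):
    """Gradient w.r.t. v of the entropic semi-dual term for one sample x_i."""
    z = (v - C[i]) / eps
    z -= z.max()                      # stabilize the log-sum-exp
    p = b * np.exp(z); p /= p.sum()   # soft assignment of x_i to targets
    return b - p

# Averaged SGD ascent on the (concave) semi-dual; a sketch, not the
# SAGA/SVRG/FTRL variants discussed in the paper.
v, v_bar = np.zeros(m), np.zeros(m)
for t in range(1, 20001):
    i = rng.integers(n)               # sample a source point (a is uniform here)
    v += (1.0 / np.sqrt(t)) * semidual_grad(i, v)
    v_bar += (v - v_bar) / t          # Polyak-Ruppert averaging

# Recover the primal plan from the dual; near the optimum its target
# marginal should approach b.
Z = (v_bar - C) / eps
Z -= Z.max(1, keepdims=True)
P = b * np.exp(Z)
P = a[:, None] * (P / P.sum(1, keepdims=True))
print(np.abs(P.sum(0) - b).max())  # marginal violation shrinks with training
```

Strict convexity of the regularized problem is what makes this unconstrained stochastic scheme sound; the paper's contribution concerns better estimators (SAGA, SVRG) and the online-learning/FTRL view for the semi-discrete case.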