Abstract

Distances comparing two probability distributions play a fundamental role in statistics and machine learning. Optimal transport (OT) theory provides a theoretical framework for studying such distances. Recent advances in OT theory include a generalization of classical OT with an extra entropic constraint or regularization, called entropic OT. Despite its computational convenience, entropic OT still lacks sufficient theoretical support. In this paper, we show that the quadratic cost in entropic OT can be upper-bounded using entropy power inequality (EPI)-type bounds. First, we prove an HWI-type inequality by making use of the infinitesimal displacement convexity of the OT map. Second, we derive two Talagrand-type inequalities using the saturation of the EPI, which corresponds to a numerical term in our expressions. These two new inequalities generalize previous results of Bolley et al. and Bai et al. Using the new Talagrand-type inequalities, we also show that the geometry observed by the Sinkhorn distance is smoothed in the sense of measure concentration. Finally, we corroborate our results with various simulation studies.

Highlights

  • Optimal transport (OT) theory studies how to transport one measure to another along a path of minimal cost

  • We show that the geometry observed by the Sinkhorn distance is smoothed in the sense of measure concentration

  • We further show that the added uncertainty smooths the geometric properties of the Wasserstein distance, i.e., the Sinkhorn distance implies a looser measure concentration inequality


Introduction

OT theory studies how to transport one measure to another along a path of minimal cost. The Wasserstein distance is the cost of the optimal path and is closely connected with information measures; see, e.g., [1,2,3,4,5]. Entropic OT adds an entropic constraint to the domain of the optimization problem; this constraint injects randomness into the originally deterministic system, so the total cost increases from the Wasserstein distance to a larger value. A natural question is how to quantify the extra cost caused by the entropic constraint. We derive upper bounds for the quadratic cost of entropic OT, which include an entropy power term that quantifies the amount of uncertainty caused by the entropic constraint. This work is an extended version of [11].

Literature Review
Contributions
Notation
Organization of the Paper
Synopsis of Optimal Transport
Measure Concentration
Entropy Power Inequality and Deconvolution
Main Theoretical Results
Numerical Simulations
Conclusions and Future Directions