Abstract

In 1931/32, Erwin Schrödinger posed, and to a large extent solved, the problem of finding the most likely random evolution between two continuous probability distributions. This article considers this problem in the case when only samples of the two distributions are available. A novel iterative procedure is proposed, inspired by Fortet-IPF-Sinkhorn type algorithms. Since only samples of the marginals are available, the new approach features constrained maximum likelihood estimation in place of the nonlinear boundary couplings, and importance sampling to propagate the functions φ and φ̂ solving the Schrödinger system. This method mitigates the curse of dimensionality, in contrast to grid-based approaches, which become numerically unfeasible in high dimensions. The methodology is illustrated in two applications: entropic interpolation of two-dimensional Gaussian mixtures, and the estimation of integrals through a variation of importance sampling. © 2020 Wiley Periodicals LLC.
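To fix ideas, the classical Fortet-IPF-Sinkhorn iteration the abstract refers to can be sketched on empirical samples. The following is an illustrative sketch only, not the article's constrained-maximum-likelihood variant: it runs textbook Sinkhorn scaling iterations for entropic optimal transport between two point clouds, alternately rescaling the potentials until both marginals of the coupling are matched. All names, the squared-distance cost, and the parameter values (`eps`, `iters`) are assumptions chosen for the example.

```python
import numpy as np

def sinkhorn(x, y, eps=0.5, iters=500):
    """Entropic OT coupling between uniformly weighted samples x and y.

    Textbook Sinkhorn/IPF fixed-point scheme: alternately fit the row
    and column marginals of pi = diag(u) K diag(v).
    """
    # Squared-Euclidean cost matrix between the two point clouds.
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-C / eps)                   # Gibbs kernel
    mu = np.full(len(x), 1.0 / len(x))     # uniform empirical marginal on x
    nu = np.full(len(y), 1.0 / len(y))     # uniform empirical marginal on y
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(iters):                 # alternate marginal fittings
        u = mu / (K @ v)
        v = nu / (K.T @ u)
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(50, 2))    # samples of the initial marginal
y = rng.normal(1.0, 1.0, size=(50, 2))    # samples of the final marginal
pi = sinkhorn(x, y)
# At convergence both marginals of pi agree with the empirical ones.
print(np.abs(pi.sum(axis=1) - 1.0 / 50).max())
print(np.abs(pi.sum(axis=0) - 1.0 / 50).max())
```

The article's contribution is precisely to avoid forming the full kernel matrix `K` on a grid, replacing the boundary couplings by constrained maximum likelihood estimation and using importance sampling to propagate the Schrödinger potentials from samples alone.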
