In this paper, motivated by the approximation of Martingale Optimal Transport problems, we are interested in sampling methods preserving the convex order for two probability measures µ and ν on ℝ^d, with ν dominating µ. When (Xi)1≤i≤I (resp. (Yj)1≤j≤J) are independent and identically distributed according to µ (resp. ν), the empirical measures µI = (1/I) Σ_{i=1}^I δ_{Xi} and νJ = (1/J) Σ_{j=1}^J δ_{Yj} are in general not rankable for the convex order. We investigate modifications of µI (resp. νJ) smaller than νJ (resp. greater than µI) in the convex order and weakly converging to µ (resp. ν) as I, J → ∞. We first consider the one-dimensional case d = 1, where, according to Kertz and Rösler, the set of probability measures with a finite first-order moment is a lattice for the increasing and the decreasing convex orders. Given µ and ν in this set, we define µ ∨ ν (resp. µ ∧ ν) as the supremum (resp. infimum) of µ and ν for the decreasing convex order when ∫ℝ x µ(dx) ≤ ∫ℝ x ν(dx), and for the increasing convex order otherwise. This way, µ ∨ ν (resp. µ ∧ ν) is greater than µ (resp. smaller than ν) in the convex order. We give efficient algorithms to compute µ ∨ ν and µ ∧ ν (and therefore µI ∨ νJ and µI ∧ νJ) when µ and ν are convex combinations of Dirac masses. In general dimension, when µ and ν have finite moments of order ϱ ≥ 1, we define the projection µ ⋏ϱ ν (resp. µ ⋎ϱ ν) of µ (resp. ν) on the set of probability measures dominated by ν (resp. larger than µ) in the convex order, for the Wasserstein distance with index ϱ. When ϱ = 2, µI ⋏2 νJ can be computed efficiently by solving a quadratic optimization problem with linear constraints. It turns out that, in dimension d = 1, the projections do not depend on ϱ and their quantile functions are explicit in terms of those of µ and ν, which leads to efficient algorithms for convex combinations of Dirac masses. Last, we illustrate by numerical experiments the resulting sampling methods that preserve the convex order and their application to approximate Martingale Optimal Transport problems.
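
For ϱ = 2, the abstract states that µI ⋏2 νJ can be computed by solving a quadratic optimization problem with linear constraints. The following is a minimal sketch, not the authors' code, of one way such a program can be set up: it assumes the projection is parametrized by a coupling matrix π between the atoms of µI and νJ, the projected atoms being the conditional barycenters (Σj πij yj)/pi. The function name convex_order_projection and the use of the cvxpy solver are illustrative choices, not taken from the paper.

# Sketch: Wasserstein-2 projection of mu_I onto the measures dominated by nu_J
# in the convex order, posed as a quadratic program with linear constraints.
# Assumption (not from the paper): the projection is searched for in the family
# of measures sum_i p_i delta_{z_i} with z_i = (sum_j pi_ij y_j) / p_i for a
# coupling pi with marginals p and q.
import numpy as np
import cvxpy as cp

def convex_order_projection(x, p, y, q):
    """x: (I, d) atoms of mu_I, p: (I,) weights;
    y: (J, d) atoms of nu_J, q: (J,) weights.
    Returns the atoms z (I, d) of the candidate projection (weights stay p)."""
    I = x.shape[0]
    J = y.shape[0]
    pi = cp.Variable((I, J), nonneg=True)  # coupling between the two supports

    # Objective: sum_i p_i | x_i - (sum_j pi_ij y_j) / p_i |^2,
    # which is quadratic in pi since the barycenters are affine in pi.
    obj = 0
    for i in range(I):
        obj = obj + p[i] * cp.sum_squares(x[i] - (pi[i, :] @ y) / p[i])

    constraints = [cp.sum(pi, axis=1) == p,  # first marginal is p
                   cp.sum(pi, axis=0) == q]  # second marginal is q
    cp.Problem(cp.Minimize(obj), constraints).solve()

    z = (pi.value @ y) / p[:, None]  # atoms of the projected measure
    return z

Any feasible π in this sketch yields a measure dominated by νJ in the convex order (by Jensen's inequality applied to the associated martingale coupling), so the minimizer is consistent with the quadratic-program characterization of µI ⋏2 νJ mentioned in the abstract; the precise formulation used in the paper may differ.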