Abstract

We introduce a new method for obtaining quantitative convergence rates for the central limit theorem (CLT) in a high-dimensional setting. Using our method, we obtain several new bounds for convergence in transportation distance and entropy, and in particular: (a) We improve the best known bound, obtained by the third named author (Probab. Theory Related Fields 170 (2018) 821–845), for convergence in quadratic Wasserstein transportation distance for bounded random vectors; (b) we derive the first nonasymptotic convergence rate for the entropic CLT in arbitrary dimension, for general log-concave random vectors (this adds to (Ann. Inst. Henri Poincaré Probab. Stat. 55 (2019) 777–790), where a finite Fisher information is assumed); (c) we give an improved bound for convergence in transportation distance under a log-concavity assumption and improvements for both metrics under the assumption of strong log-concavity. Our method is based on martingale embeddings and specifically on the Skorokhod embedding constructed in (Ann. Inst. Henri Poincaré Probab. Stat. 52 (2016) 1259–1280).
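As background (the following definitions are standard and are not part of the abstract itself): for i.i.d. centered random vectors $X_1,\dots,X_n$ in $\mathbb{R}^d$ with law $\mu$, write $S_n = n^{-1/2}\sum_{i=1}^n X_i$ and let $\gamma$ denote the limiting Gaussian law. The two notions of distance referred to above are the quadratic Wasserstein distance and the relative entropy,
$$
W_2^2(\nu,\gamma) \;=\; \inf_{\pi} \int_{\mathbb{R}^d\times\mathbb{R}^d} |x-y|^2 \, d\pi(x,y),
\qquad
\mathrm{Ent}(\nu \,\|\, \gamma) \;=\; \int_{\mathbb{R}^d} \log\frac{d\nu}{d\gamma}\, d\nu,
$$
where the infimum runs over all couplings $\pi$ of $\nu$ and $\gamma$. The bounds described in (a)–(c) control $W_2(\mathrm{law}(S_n),\gamma)$ and $\mathrm{Ent}(\mathrm{law}(S_n)\,\|\,\gamma)$ as explicit functions of the sample size $n$ and the dimension $d$.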
