Abstract

We showcase some unexplored connections between saddlepoint approximations, measure transportation, and key topics in information theory. To bridge these areas, we selectively review fundamental results available in the literature and draw the connections between them. First, for a generic random variable, we explain how Esscher tilting (a result rooted in information theory that lies at the heart of saddlepoint approximations) is connected to the solution of the dual Kantorovich problem (which lies at the heart of measure transportation theory) via the Legendre transform of the cumulant generating function. Then, we turn to statistics: we illustrate the connections when the random variable at hand is the sample mean or a statistic with known (either exact or approximate) cumulant generating function. The unveiled connections make it possible to look at saddlepoint approximations from different angles, putting under the spotlight the links to convex analysis (via the notion of duality) and differential geometry (via the notion of geodesic). We believe these connections can foster knowledge transfer between statistics and other disciplines, such as mathematics and machine learning. A discussion of some topics for future research concludes the paper.
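For orientation, here is a minimal sketch of the standard objects the abstract alludes to, written in our own notation (a random variable X with cumulant generating function K); the exact formulations used in the paper may differ:

\[
K(s) = \log \mathbb{E}\left[e^{sX}\right]
\qquad \text{(cumulant generating function)}
\]
\[
\frac{dP_s}{dP} = e^{sX - K(s)}
\qquad \text{(Esscher tilt of the law of } X \text{)}
\]
\[
K^{*}(x) = \sup_{s}\,\{\, s x - K(s) \,\}
\qquad \text{(Legendre transform of } K \text{)}
\]

For the sample mean of n i.i.d. copies of X, the classical saddlepoint approximation to its density takes the form
\[
f_{\bar{X}_n}(\bar{x}) \approx \sqrt{\frac{n}{2\pi K''(\hat{s})}}\,
\exp\!\left\{\, n\left[K(\hat{s}) - \hat{s}\,\bar{x}\right] \right\},
\qquad \text{where } K'(\hat{s}) = \bar{x}.
\]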
