Abstract
This paper develops a new divergence that generalizes relative entropy and can be used to compare probability measures without a requirement of absolute continuity. We establish properties of the divergence, and in particular derive and exploit a representation as an infimum convolution of optimal transport cost and relative entropy. Also included are examples of computation and approximation of the divergence, and the demonstration of properties that are useful when one quantifies model uncertainty.
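The inf-convolution representation mentioned in the abstract can be illustrated numerically on a finite state space: the divergence is the infimum, over couplings whose first marginal is μ, of the transport cost plus the relative entropy of the second (intermediate) marginal with respect to ν. The sketch below is not code from the paper; the state space, the cost c(x, y) = |x − y|, and the measures mu and nu are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Tiny finite state space {0, 1, 2}; transport cost c(x, y) = |x - y|.
n = 3
x = np.arange(n, dtype=float)
cost = np.abs(x[:, None] - x[None, :])
mu = np.array([0.2, 0.3, 0.5])   # measure being compared
nu = np.array([0.5, 0.3, 0.2])   # reference (design) measure

def rel_ent(p, q):
    """Relative entropy R(p || q) on a finite space."""
    p = np.clip(p, 1e-12, None)  # keep the log finite at the boundary
    return float(np.sum(p * np.log(p / q)))

def objective(flat):
    pi = flat.reshape(n, n)      # coupling; row sums are the first marginal
    gamma = pi.sum(axis=0)       # second marginal: the intermediate measure
    return float(np.sum(pi * cost)) + rel_ent(gamma, nu)

# Equality constraints: first marginal of the coupling equals mu.
cons = [{"type": "eq", "fun": lambda f, i=i: f.reshape(n, n)[i].sum() - mu[i]}
        for i in range(n)]
res = minimize(objective, np.outer(mu, nu).ravel(),
               bounds=[(0.0, 1.0)] * (n * n), constraints=cons, method="SLSQP")
D = res.fun  # numerical value of the inf-convolution divergence
```

Taking the diagonal coupling (zero transport cost, intermediate measure μ) shows the divergence is bounded above by R(μ‖ν), and taking the intermediate measure equal to ν bounds it by the optimal transport cost, so D should not exceed the smaller of the two.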
Highlights
To compare different probabilistic models for a given application, one needs a notion of “distance” between the distributions.
For situations that require an analysis of model form uncertainty, the quantity known as relative entropy is the most widely used such distance.
This is true because relative entropy has all the attractive properties asked for above, and many more. (Relative entropy is not a true metric since it is not symmetric in its arguments, but owing to its other attributes it is more widely used for these purposes than any legitimate metric.)
Summary
To compare different probabilistic models for a given application, one needs a notion of “distance” between the distributions. (By introducing a parameter one can obtain bounds that are in some sense optimal [11].) We typically interpret the integral ∫ g dμ as a performance measure, and so we have a bound on the performance of the system under the true distribution in terms of the relative entropy distance R(μ‖ν) plus a risk-sensitive performance measure under the design model. [24] makes use of an inf-convolution formula analogous to the one presented above to extend type-1 Wasserstein distances to positive measures.
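The bound described above is the standard variational (Donsker–Varadhan) inequality: for any c > 0, E_μ[g] ≤ (1/c)[R(μ‖ν) + log E_ν[e^{cg}]], where the second term is the risk-sensitive performance under the design model ν. The following sketch checks it numerically on a finite space; the distributions and the performance function g are randomly generated assumptions, not data from the paper.

```python
import numpy as np

# Finite state space; nu is the design model, mu a candidate "true" model.
rng = np.random.default_rng(1)
n = 6
nu = rng.random(n); nu /= nu.sum()
mu = rng.random(n); mu /= mu.sum()
g = rng.random(n)                    # performance function g

def rel_entropy(p, q):
    """R(p || q) for strictly positive finite distributions."""
    return float(np.sum(p * np.log(p / q)))

# Variational bound: for any c > 0,
#   E_mu[g] <= (1/c) * ( R(mu || nu) + log E_nu[exp(c * g)] )
for c in (0.5, 1.0, 2.0):
    lhs = float(mu @ g)
    rhs = (rel_entropy(mu, nu) + np.log(float(nu @ np.exp(c * g)))) / c
    assert lhs <= rhs + 1e-12
```

Tightening over c is the sense in which "introducing a parameter" yields bounds that are optimal: the right-hand side can be minimized over c > 0 for a given μ.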