Abstract

The total variation distance is a core statistical distance between probability measures that satisfies the metric axioms and always takes values in [0,1]. Since the total variation distance does not admit closed-form expressions for statistical mixtures, in practice one often has to rely either on costly numerical integration or on fast Monte Carlo approximations, which, however, provide no deterministic guarantees. In this work, we consider two methods for bounding the total variation distance between univariate mixture models. The first exploits the information monotonicity property of the total variation to design guaranteed, nested, deterministic lower bounds. The second computes the geometric lower and upper envelopes of the weighted mixture components to derive deterministic bounds based on density ratios. We demonstrate the tightness of our bounds by simulating Gaussian, Gamma, and Rayleigh mixture models.
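
To make the setting concrete, here is a minimal sketch (not from the paper) that contrasts a plain Monte Carlo estimate of the total variation distance with the coarse-graining lower bound that follows from information monotonicity, for two univariate Gaussian mixtures. The mixture parameters m1 and m2, the truncation range, and the bin counts are all hypothetical placeholders chosen for illustration.

```python
# Sketch: estimating vs. deterministically lower-bounding the total variation
# (TV) distance between two univariate Gaussian mixtures.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Two Gaussian mixtures, each given as (weights, means, standard deviations).
m1 = (np.array([0.3, 0.7]), np.array([-1.0, 2.0]), np.array([1.0, 0.5]))
m2 = (np.array([0.5, 0.5]), np.array([0.0, 1.5]), np.array([0.8, 0.6]))

def pdf(m, x):
    w, mu, sigma = m
    return np.sum(w * norm.pdf(x[:, None], mu, sigma), axis=1)

def cdf(m, x):
    w, mu, sigma = m
    return np.sum(w * norm.cdf(x[:, None], mu, sigma), axis=1)

def sample(m, n):
    w, mu, sigma = m
    idx = rng.choice(len(w), size=n, p=w)
    return rng.normal(mu[idx], sigma[idx])

def tv_monte_carlo(ma, mb, n=100_000):
    # TV(ma, mb) = (1/2) E_{x ~ ma}[ |1 - mb(x)/ma(x)| ]: a consistent
    # stochastic estimate, but with no deterministic error guarantee.
    x = sample(ma, n)
    return 0.5 * np.mean(np.abs(1.0 - pdf(mb, x) / pdf(ma, x)))

def tv_lower_bound(ma, mb, n_bins):
    # Coarse-grain the real line into intervals; by information monotonicity,
    # the TV between the induced discrete distributions lower-bounds the TV
    # between the mixtures. Dropping the two unbounded tail cells only
    # decreases the sum, so the result remains a valid lower bound.
    edges = np.linspace(-10.0, 10.0, n_bins + 1)  # crude support truncation
    p = np.diff(cdf(ma, edges))
    q = np.diff(cdf(mb, edges))
    return 0.5 * np.sum(np.abs(p - q))

print("Monte Carlo estimate:", tv_monte_carlo(m1, m2))
for k in (4, 16, 64, 256):
    print(f"lower bound, {k:3d} bins:", tv_lower_bound(m1, m2, k))
```

Since each bin count above is a multiple of the previous one over the same grid, the successive partitions are nested refinements, so the printed lower bounds can only increase toward the true distance as the bin count grows. The second, envelope-based method of the paper is not sketched here.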
