Abstract

In this paper, we provide three applications of f-divergences: (i) we derive a generalization of Sanov's upper bound on the tail probability of a sum of independent random variables based on super-modular f-divergences and show that this generalized Sanov bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length; we extend the rate-distortion function using mutual f-information and, using super-modular f-divergences, provide new and strictly better bounds on achievable rates in the finite blocklength regime; and (iii) we establish a connection between the generalization error of algorithms with bounded input/output mutual f-information and a generalized rate-distortion problem. This connection allows us to bound the generalization error of learning algorithms using lower bounds on the f-rate-distortion function. Our bound relies on a new lower bound on the rate-distortion function that, in some examples, strictly improves over the previously best-known bounds.
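As background on the central object of the abstract: for a convex function f with f(1) = 0, the f-divergence between discrete distributions P and Q is D_f(P‖Q) = Σ_x Q(x) f(P(x)/Q(x)), which recovers the KL divergence for f(t) = t log t. A minimal illustrative sketch of this definition (not the paper's super-modular construction; the function names are ours):

```python
import math

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)) for discrete distributions
    given as lists of probabilities (assumes q(x) > 0 wherever p(x) > 0)."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q) if qx > 0)

# KL divergence is the f-divergence with f(t) = t * log(t)
def f_kl(t):
    return t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.5]
q = [0.9, 0.1]
print(f_divergence(p, q, f_kl))  # equals KL(P || Q) in nats
```

Choosing other convex f (e.g., f(t) = (t - 1)^2 for the chi-squared divergence) yields the other members of the family the paper works with.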
