Abstract

In this paper we derive an inequality relating linear combinations of the mutual information between subsets of mutually independent random variables and an auxiliary random variable. One choice of a family of auxiliary random variables yields a new proof of a Stam-type inequality for the Fisher information of sums of independent random variables. Another choice yields new results, as well as new proofs of known results, on strong data-processing constants and the maximal correlation between sums of independent random variables. Further results include the convexity of the Kullback–Leibler divergence along a parameterized path of pairs of binomial and Poisson distributions, and a new duality-based argument relating the Stam-type inequality to the entropy power inequality.
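For context, the abstract does not reproduce the paper's new mutual-information inequality, so only the classical forms of the two inequalities it names are sketched here. For independent random variables $X$ and $Y$ with sufficiently smooth densities, Stam's inequality bounds the Fisher information $J(\cdot)$ of the sum,
\[
  \frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)},
\]
and the entropy power inequality states, with entropy power $N(X) = \frac{1}{2\pi e}\, e^{2h(X)}$ and $h(\cdot)$ the differential entropy,
\[
  N(X+Y) \;\ge\; N(X) + N(Y).
\]
The duality-based argument mentioned above relates a Stam-type inequality of the first kind to an entropy power inequality of the second kind.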
