Abstract

Information Noise Contrastive Estimation (InfoNCE) is a popular neural estimator of mutual information (MI). While InfoNCE has demonstrated impressive results in representation learning, its MI estimates can be significantly off. The original estimator is known to underestimate the MI because of its log n upper bound, where n is the sample size; we show that a subsequently proposed fix can instead cause the MI estimate to overshoot, apparently without any bound. We propose a novel MI variational estimator, smoothed InfoNCE, that resolves these issues by smoothing out the contrastive estimation. Experiments on high-dimensional Gaussian data confirm that the proposed estimator can break the log n curse without overshooting.
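
The log n ceiling mentioned above can be seen directly from the standard InfoNCE bound: the per-sample log-softmax term over one positive and n-1 negatives is never positive, so the estimate cannot exceed log n. The following is a minimal sketch (not the authors' smoothed InfoNCE estimator) illustrating this saturation on correlated Gaussian data with an analytically optimal critic; the variable names, dimensions, and correlation value are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the standard InfoNCE bound and its log(n) ceiling.
import numpy as np

rng = np.random.default_rng(0)

def infonce_bound(scores):
    """InfoNCE lower bound from an n x n critic matrix (diagonal = positive pairs).

    The per-row log-softmax of the diagonal entry is <= 0, so the returned
    estimate can never exceed log(n).
    """
    n = scores.shape[0]
    m = scores.max(axis=1, keepdims=True)                    # stable logsumexp
    row_lse = m.squeeze(1) + np.log(np.exp(scores - m).sum(axis=1))
    return float((np.diag(scores) - row_lse).mean()) + np.log(n)

# d-dimensional Gaussians with per-dimension correlation rho (illustrative values)
d, rho, n = 10, 0.9, 128
x = rng.standard_normal((n, d))
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal((n, d))
true_mi = -0.5 * d * np.log(1 - rho**2)                      # ~8.3 nats

# Optimal critic: pointwise mutual information log p(x, y) / (p(x) p(y))
xy = x @ y.T
x2 = (x**2).sum(axis=1)[:, None]
y2 = (y**2).sum(axis=1)[None, :]
pmi = (-0.5 * d * np.log(1 - rho**2)
       + (2 * rho * xy - rho**2 * (x2 + y2)) / (2 * (1 - rho**2)))

print(f"true MI          : {true_mi:.2f} nats")
print(f"InfoNCE (n={n}) : {infonce_bound(pmi):.2f} nats (cap log n = {np.log(n):.2f})")
```

Even with the exact density-ratio critic, the estimate stays near log n ≈ 4.85 nats while the true MI is about 8.3 nats, which is the underestimation behavior the abstract refers to.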
