Abstract

The information-bottleneck (IB) principle is defined in terms of mutual information. This study defines the mutual information between two random variables using the Jensen-Shannon (JS) divergence instead of the standard definition, which is based on the Kullback-Leibler (KL) divergence. We reformulate the information-bottleneck principle using the proposed mutual information and apply it to the problem of pairwise clustering. We show that applying IB to clustering tasks using the JS divergence instead of the KL divergence yields improved results. This indicates that the JS-based mutual information has expressive power at least equal to that of the standard KL-based mutual information.
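
The sketch below illustrates the kind of substitution the abstract describes, under the assumption that the JS-based mutual information is obtained by replacing the KL divergence in the standard definition I(X;Y) = KL(p(x,y) || p(x)p(y)) with the JS divergence between the joint distribution and the product of its marginals; the function names and the toy joint distribution are illustrative, not taken from the paper.

```python
import numpy as np

def kl(p, q):
    """KL divergence between two discrete distributions given as flat arrays."""
    mask = p > 0  # terms with p(i) = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js(p, q):
    """Jensen-Shannon divergence: average KL of p and q to their mixture."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_mutual_information(pxy):
    """JS-based mutual information (assumed form): JS divergence between
    the joint p(x, y) and the product of its marginals p(x) p(y)."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    return js(pxy.ravel(), (px * py).ravel())

# Example: a joint distribution over two binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(js_mutual_information(pxy))
```

Because the JS divergence is symmetric and bounded, this quantity stays finite even when the joint and the product of marginals have mismatched supports, which is one practical difference from the KL-based definition.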
