Abstract

Information-theoretic measures such as mutual information are often said to capture nonlinear dependencies, whereas covariance (and correlation) capture only linear dependencies. We illustrate this claim using centered random variables. The set of centered values $F_c = \{-\frac{q-1}{2},\, -\frac{q-1}{2}+1,\, \ldots,\, \frac{q-1}{2}-1,\, \frac{q-1}{2}\}$ is obtained by mapping from $F = \{1, 2, \ldots, q-1, q\}$ via $x \mapsto x - \frac{q+1}{2}$. For $q = 2$, we derive the relationship between the mutual information function $I$ and the covariance function $\Gamma$, and show that $\Gamma = 0 \Rightarrow I = 0$. Furthermore, we show that for $q = 3$ the nonlinearities are captured by mutual information, by exhibiting a case where $\Gamma = 0$ but $I \neq 0$, so that $\Gamma = 0 \nRightarrow I = 0$.
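To make the $q = 3$ case concrete, here is a minimal numerical sketch of one distribution of the kind the abstract alludes to: $X$ uniform on the centered alphabet $\{-1, 0, 1\}$ and $Y = X^2$, a purely nonlinear dependence. The joint distribution, variable names, and base-2 logarithm are illustrative assumptions, not necessarily the paper's construction.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): for q = 3 the centered
# alphabet is {-1, 0, 1}. Take X uniform on {-1, 0, 1} and Y = X**2, so the
# joint distribution puts mass 1/3 on each of (-1, 1), (0, 0), (1, 1).

support_x = np.array([-1, 0, 1])
support_y = np.array([0, 1])

# joint[i, j] = P(X = support_x[i], Y = support_y[j])
joint = np.array([
    [0.0, 1/3],   # X = -1  ->  Y = 1
    [1/3, 0.0],   # X =  0  ->  Y = 0
    [0.0, 1/3],   # X =  1  ->  Y = 1
])

px = joint.sum(axis=1)   # marginal distribution of X
py = joint.sum(axis=0)   # marginal distribution of Y

# Covariance: Gamma = E[XY] - E[X] E[Y]
exy = sum(joint[i, j] * support_x[i] * support_y[j]
          for i in range(3) for j in range(2))
gamma = exy - (px @ support_x) * (py @ support_y)

# Mutual information (bits): I = sum p(x,y) log2( p(x,y) / (p(x) p(y)) )
mi = sum(joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
         for i in range(3) for j in range(2) if joint[i, j] > 0)

print(f"Gamma = {gamma:.6f}")   # 0.000000  -> zero covariance
print(f"I     = {mi:.6f}")      # ~0.918296 -> nonzero mutual information
```

Because $X$ is symmetric about zero, $E[XY] = E[X^3] = 0$ and $E[X] = 0$, so $\Gamma = 0$; yet $Y$ is a deterministic function of $X$, so $I(X; Y) = H(Y) \approx 0.918$ bits, exhibiting a dependence that covariance misses.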
