Abstract

There has been a lot of work on measures of dependence or association for bivariate probability distributions or bivariate data. These measures usually assume that the variables are both continuous or both categorical. In comparison, there is very little work on multivariate or conditional measures of dependence. The purpose of this article is to discuss measures of multivariate dependence and measures of conditional dependence based on relative entropies. These measures are conceptually very general, as they can be used for a set of variables that can be a mixture of continuous, ordinal-categorical, and nominal-categorical variables. For continuous or ordinal-categorical variables, a certain transformation of relative entropy to the interval [0, 1] leads to generalizations of the correlation, multiple-correlation, and partial-correlation coefficients. If all variables are nominal categorical, the relative entropies are standardized to take a maximum of 1 and then transformed so that in the bivariate ...
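To make the entropy-based idea concrete, below is a minimal Python sketch (not from the article) that estimates the mutual information I(X; Y), i.e. the relative entropy between a bivariate joint distribution and the product of its margins, from a contingency table and maps it to [0, 1) via delta = sqrt(1 - exp(-2 I)). This particular map recovers |rho| when (X, Y) is bivariate normal, since then I = -(1/2) ln(1 - rho^2), which is what makes it a natural generalization of the correlation coefficient. The function names and the specific choice of transformation here are illustrative assumptions; the article's exact definitions and the standardizations for nominal variables are given in the full text.

import numpy as np

def mutual_information(joint):
    """Mutual information I(X;Y) in nats from a table of joint probabilities.

    `joint` is a 2-D array of cell probabilities (or counts, normalised
    below).  I(X;Y) is the relative entropy between the joint distribution
    and the product of its margins; it is zero exactly when X and Y are
    independent.
    """
    p = np.asarray(joint, dtype=float)
    p = p / p.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal distribution of X
    py = p.sum(axis=0, keepdims=True)   # marginal distribution of Y
    mask = p > 0                        # treat 0 * log 0 as 0
    return float(np.sum(p[mask] * np.log(p[mask] / (px * py)[mask])))

def entropy_dependence(joint):
    """Map I(X;Y) to [0, 1): delta = sqrt(1 - exp(-2 I)).

    For a bivariate normal distribution I = -0.5 * ln(1 - rho^2), so this
    transformation returns |rho| in that case.
    """
    return float(np.sqrt(1.0 - np.exp(-2.0 * mutual_information(joint))))

# Example: a 2x2 table with positive association versus an independent table.
dependent   = np.array([[0.40, 0.10],
                        [0.10, 0.40]])
independent = np.outer([0.5, 0.5], [0.5, 0.5])
print(entropy_dependence(dependent))    # roughly 0.57
print(entropy_dependence(independent))  # 0.0

The same recipe extends in principle to the multivariate and conditional measures discussed in the abstract by replacing I(X; Y) with the corresponding relative entropy (for example, between a joint distribution and the product of a subset of its margins), but those constructions, and the normalization to a maximum of 1 for nominal variables, follow the article's definitions rather than this sketch.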
