Abstract

In this paper, we model dependent categorical data using the concept of mutual information to obtain a measure of statistical dependence. We first derive the entropy and the mutual information index for the exponential power distribution. These concepts, developed by Shannon in the context of information theory, have been studied extensively for the multivariate normal distribution. We then extend these tools to the special case of fully symmetric multivariate elliptical distributions and establish the upper bound for the entropy, which is attained by the normal density. We further derive a nonlinear joint model for dependent random vectors spanning an elliptical vector space, which captures multivariate relationships among non-empty subsets of vectors via multivariate mutual information, under the assumption that the subsets of each vector and their interactions can be represented in discrete form. To illustrate the application, we investigate the multivariate dependence among several sites based on the dominance of selected attributes.
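For context, the entropy bound mentioned above corresponds to the well-known maximum-entropy property of the Gaussian; a minimal statement in LaTeX follows, assuming $h(\cdot)$ denotes differential entropy and $\Sigma$ the covariance matrix of a $p$-dimensional density $f$ (these are standard identities, not formulas reproduced from the paper):

$$
h(f) \le \frac{1}{2}\log\!\bigl((2\pi e)^{p}\,\lvert\Sigma\rvert\bigr),
$$

with equality if and only if $f$ is the $p$-variate normal density with covariance $\Sigma$. Likewise, for jointly normal vectors $X$ and $Y$ with joint covariance $\Sigma$ partitioned into blocks $\Sigma_{XX}$ and $\Sigma_{YY}$, the mutual information index takes the form

$$
I(X;Y) = -\frac{1}{2}\log\frac{\lvert\Sigma\rvert}{\lvert\Sigma_{XX}\rvert\,\lvert\Sigma_{YY}\rvert}.
$$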
