Abstract

Linfoot (1957) introduced an informational measure of correlation $r_I$ between two random variables $X$ and $Y$. The measure $r_I$ is based on the information gain $r_0$ in knowing that $X$ and $Y$ are mutually dependent with a given bivariate density function, as compared with the original knowledge that $X$ and $Y$ are statistically independent. In the present paper, an asymptotic form of the information measure $r_I$, denoted by $\tilde{r}_I$, is derived in terms of Pearson's (1904) chi-square for contingency tables. Hence $\tilde{r}_I$ is suggested as an information measure of association in contingency tables. On comparing $\tilde{r}_I$ with Pearson's classical coefficient of contingency $P$, it is found that $\tilde{r}_I \geq P$. This is a desirable property of $\tilde{r}_I$, since Lancaster and Hamdan (1964) demonstrated that $P$ underestimates the product-moment correlation coefficient in contingency tables with broad categories. The asymptotic variance of $\tilde{r}_I$ is derived and compared with the asymptotic variance of $P$.
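The abstract does not give the closed forms, but a plausible reading (an assumption here, not a quotation from the paper) is that, with total sample size $n$ and Pearson chi-square $\chi^2$, the asymptotic measure is $\tilde{r}_I = \sqrt{1 - e^{-\chi^2/n}}$ while Pearson's coefficient of contingency is $P = \sqrt{\chi^2/(\chi^2 + n)}$; under these assumed forms the elementary inequality $1 - e^{-x} \geq x/(1+x)$ for $x \geq 0$ yields $\tilde{r}_I \geq P$. The short Python sketch below computes both quantities from a hypothetical contingency table to illustrate the inequality; the example table and the function name `association_measures` are illustrative only.

```python
import numpy as np

def association_measures(table):
    """Return (r_tilde_I, P) for a two-way contingency table of counts.

    Assumed closed forms (inferred from the abstract, not quoted from the paper):
        r_tilde_I = sqrt(1 - exp(-chi2 / n))
        P         = sqrt(chi2 / (chi2 + n))
    """
    table = np.asarray(table, dtype=float)
    n = table.sum()                                        # total sample size
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    chi2 = ((table - expected) ** 2 / expected).sum()      # Pearson chi-square
    r_tilde_I = np.sqrt(1.0 - np.exp(-chi2 / n))           # assumed asymptotic informational measure
    P = np.sqrt(chi2 / (chi2 + n))                         # Pearson's coefficient of contingency
    return r_tilde_I, P

# Hypothetical 3x3 table with moderate association
table = [[30, 10,  5],
         [10, 25, 10],
         [ 5, 10, 30]]
r_tilde_I, P = association_measures(table)
print(f"r~_I = {r_tilde_I:.4f}   P = {P:.4f}")             # r~_I >= P here
```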
