Abstract

In mathematical statistics education, mutual information can be used as a tool for evaluating the degree of dependency between two random variables. The ordinary correlation coefficient captures only linear dependency; it provides no information about any nonlinear relationship between the variables. In this paper, as measures of the degree of dependency between random variables, we suggest the use of symmetric uncertainty and $\lambda$, both of which are defined in terms of mutual information. They can also be regarded as generalized correlation coefficients, covering both linear and nonlinear dependence of random variables.
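To illustrate the idea, the following is a minimal sketch (not the paper's own code) of symmetric uncertainty for discrete samples, using the standard definition $U(X,Y) = 2I(X;Y)/(H(X)+H(Y))$ with $I(X;Y) = H(X) + H(Y) - H(X,Y)$. The example data ($y = x^2$ on a symmetric support, where Pearson correlation is exactly zero) is an assumption chosen to show how a mutual-information-based measure detects nonlinear dependence; the paper's $\lambda$ has its own definition and is not reproduced here.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy (bits) of the empirical distribution of xs."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def symmetric_uncertainty(xs, ys):
    """U(X,Y) = 2 I(X;Y) / (H(X) + H(Y)), in [0, 1]."""
    hx, hy = entropy(xs), entropy(ys)
    if hx + hy == 0:  # both variables constant: no uncertainty to share
        return 0.0
    return 2 * mutual_information(xs, ys) / (hx + hy)

# Nonlinear dependence: y = x^2 on a symmetric support has zero
# Pearson correlation, yet symmetric uncertainty is clearly positive.
xs = [-2, -1, 0, 1, 2]
ys = [x * x for x in xs]
print(symmetric_uncertainty(xs, ys))  # positive despite zero correlation
print(symmetric_uncertainty(xs, xs))  # perfect dependence gives 1.0
```

Note that, unlike the correlation coefficient, symmetric uncertainty is normalized to $[0,1]$ and carries no sign, so it measures the strength of dependence rather than its direction.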
