Abstract

We propose and discuss two information-based measures of statistical dispersion of positive continuous random variables: the entropy-based dispersion and the Fisher information-based dispersion. Although standard deviation is the most frequently employed dispersion measure, we show that it is not well suited to quantify some aspects that are often expected intuitively, such as the degree of randomness. The proposed dispersion measures are not entirely independent, though each describes the probability distribution from a different point of view. We discuss relationships between the measures, describe their extremal values and illustrate their properties on the Pareto, the lognormal and the lognormal mixture distributions. Application possibilities are also mentioned.
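As an illustration of the contrast the abstract draws, the sketch below compares the standard deviation of a lognormal distribution with one common entropy-based quantity, the exponential of the differential entropy. This is only a minimal example using closed-form lognormal formulas; the paper's own dispersion measures may be defined or normalized differently.

```python
import math

def lognormal_std(mu: float, sigma: float) -> float:
    """Standard deviation of LogNormal(mu, sigma) (closed form)."""
    return math.sqrt(math.exp(sigma ** 2) - 1.0) * math.exp(mu + sigma ** 2 / 2.0)

def lognormal_entropy_dispersion(mu: float, sigma: float) -> float:
    """exp(differential entropy) of LogNormal(mu, sigma).

    Illustrative entropy-based dispersion; the paper's exact
    definition/normalization may differ.
    """
    h = mu + 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)
    return math.exp(h)

# For LogNormal(0, 1) the two measures give clearly different values,
# showing they capture different aspects of the distribution.
sd = lognormal_std(0.0, 1.0)                    # ~2.16
ed = lognormal_entropy_dispersion(0.0, 1.0)     # ~4.13
```

Distributions can rank differently under the two measures, which is why a standard-deviation-based notion of "spread" need not coincide with an entropy-based notion of randomness.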
