Abstract

We introduce and study the cumulative information generating function, which provides a unifying mathematical tool for dealing with classical and fractional entropies based on the cumulative distribution function and on the survival function. Specifically, after establishing its main properties and some bounds, we show that it is itself a variability measure that extends the Gini mean semi-difference. We also provide (i) an extension of this measure based on distortion functions, and (ii) a weighted version based on a mixture distribution. Furthermore, we explore connections with the reliability of k-out-of-n systems and with stress–strength models for multi-component systems. Finally, we address the problem of extending the cumulative information generating function to higher dimensions.
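As a hedged illustration (the abstract itself does not state the definition): in the related literature, the cumulative information generating function of a random variable X with distribution function F and survival function 1 - F is typically taken to be G(alpha, beta) = ∫ F(x)^alpha (1 - F(x))^beta dx, and at alpha = beta = 1 the integrand F(1 - F) yields the Gini mean semi-difference, i.e. half the expected absolute difference of two independent copies of X. The Python sketch below, with hypothetical helper names (cigf, exp_cdf), evaluates this integral numerically for an exponential distribution and checks the Gini special case; it is a sketch under these assumptions, not an implementation from the paper.

```python
# Minimal sketch (assumed definition): G(alpha, beta) = integral of
# F(x)^alpha * (1 - F(x))^beta over the support of X.
import numpy as np
from scipy import integrate, stats

def cigf(cdf, alpha, beta, lower, upper):
    """Numerically integrate F(x)^alpha * (1 - F(x))^beta over [lower, upper]."""
    integrand = lambda x: cdf(x) ** alpha * (1.0 - cdf(x)) ** beta
    value, _ = integrate.quad(integrand, lower, upper)
    return value

# Exponential distribution with rate 2 (scale = 1/rate).
rate = 2.0
exp_cdf = stats.expon(scale=1.0 / rate).cdf

# At alpha = beta = 1 the integral reduces to the Gini mean semi-difference,
# which for the exponential law equals 1 / (2 * rate).
print(cigf(exp_cdf, 1.0, 1.0, 0.0, np.inf))  # approximately 0.25
print(1.0 / (2.0 * rate))                    # 0.25
```

For the exponential case the integral can also be done by hand: ∫ (1 - e^(-2x)) e^(-2x) dx over [0, ∞) equals 1/2 - 1/4 = 1/4, which is what the numerical evaluation above reproduces.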
