Abstract
We propose a generalized cumulative residual information measure based on Tsallis entropy and its dynamic version. We study the characterizations of the proposed information measure and define new classes of life distributions based on this measure. Some applications are provided in relation to weighted and equilibrium probability models. Finally the empirical cumulative Tsallis entropy is proposed to estimate the new information measure.
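The abstract mentions an empirical cumulative Tsallis entropy estimator but does not reproduce its definition. As a hedged sketch only, a widely used form of the cumulative residual Tsallis entropy is ξ_α(X) = (1/(α−1)) ∫₀^∞ (F̄(x) − F̄^α(x)) dx; replacing F̄ with the empirical survival function of a sample gives a simple plug-in estimator. The function name and the exact form of the measure are assumptions, not taken from the paper:

```python
import numpy as np

def empirical_cumulative_tsallis(sample, alpha):
    """Plug-in estimate of (1/(alpha-1)) * integral of (Fbar - Fbar**alpha).

    Assumes the common cumulative residual Tsallis entropy form; the
    paper's own definition may differ.
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    gaps = np.diff(x)                      # widths of the intervals [x_(i), x_(i+1))
    sbar = (n - np.arange(1, n)) / n       # empirical survival function on each interval
    return (gaps * (sbar - sbar**alpha)).sum() / (alpha - 1)
```

For an exponential distribution with rate λ, this form of the measure equals 1/(αλ), so the estimator can be sanity-checked against a large simulated sample.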
Highlights
Shannon [1] introduced the concept of entropy, which is widely used in communication theory, information theory, physics, economics, probability and statistics, and so forth. Let X be a random variable having probability density function f(x), survival function F̄(x), and hazard rate r(x) = f(x)/F̄(x). Shannon defined the entropy of X as

H(X) = −∫₀^∞ f(x) log f(x) dx. (1)

For the residual lifetime Xₜ = [X − t | X > t], where t > 0, Ebrahimi [2] defined an entropy as a dynamic measure of uncertainty, given by

H(X; t) = −∫ₜ^∞ (f(x)/F̄(t)) log (f(x)/F̄(t)) dx. (2)
We propose a generalized cumulative residual information measure based on Tsallis entropy and its dynamic version
We study the characterizations of the proposed information measure and define new classes of life distributions based on this measure