Abstract

The concept of Shannon's (1948) entropy is the central concept of information theory; this measure is sometimes referred to as a measure of uncertainty. The entropy of a random variable is defined in terms of its probability distribution and can be shown to be a good measure of randomness or uncertainty. This chapter discusses new developments in generalized information measures. In the past, researchers' interest tended toward one- and two-scalar parametric generalizations of these measures. The unified (r, s)-measures are divided into two parts: (1) unified (r, s)-information measures and (2) M-dimensional unified (r, s)-divergence measures. The measure H(P) is Shannon's entropy and the measure H(P|Q) is the inaccuracy. The arithmetic-geometric mean divergence measure and its unified (r, s)-generalizations are also presented in this chapter.
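For reference, the base measures named above have standard definitions for probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n): H(P) is Shannon's entropy and H(P|Q) is the (Kerridge) inaccuracy; the arithmetic-geometric mean divergence T(P||Q) is written below in its commonly cited form, stated here as an assumption about the chapter's notation rather than a quotation from it.

\[
H(P) = -\sum_{i=1}^{n} p_i \log p_i, \qquad
H(P\,|\,Q) = -\sum_{i=1}^{n} p_i \log q_i,
\]
\[
T(P \,\|\, Q) = \sum_{i=1}^{n} \frac{p_i + q_i}{2} \,\log \frac{p_i + q_i}{2\sqrt{p_i\, q_i}}.
\]

The unified (r, s)-measures discussed in the chapter are two-parameter families that recover these measures as limiting or special cases of the parameters r and s.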

