Abstract
Recently, a new measure of information called extropy was introduced by Lad, Sanfilippo and Agrò as the dual of Shannon entropy. Earlier, Tsallis had introduced a measure for discrete random variables, named Tsallis entropy, as a generalization of Boltzmann–Gibbs statistics. In this work, a new measure of discrimination, called Tsallis extropy, is introduced and some of its properties are discussed. The relation between Tsallis extropy and entropy is given, and several bounds are presented. Finally, an application of this extropy to pattern recognition is demonstrated.
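The measures named above can be illustrated through their standard definitions. The sketch below computes Shannon entropy, the discrete extropy of Lad, Sanfilippo and Agrò, and Tsallis entropy for a finite probability vector, using natural logarithms; the paper's new Tsallis extropy measure itself is not reproduced here, and the example distribution is chosen only for illustration.

```python
import math

def shannon_entropy(p):
    # H(X) = -sum_i p_i ln p_i  (terms with p_i = 0 contribute 0)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    # J(X) = -sum_i (1 - p_i) ln(1 - p_i), the dual of Shannon entropy
    # introduced by Lad, Sanfilippo and Agro for discrete distributions
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

def tsallis_entropy(p, q):
    # S_q(X) = (1 - sum_i p_i^q) / (q - 1), the Tsallis generalization
    # of Boltzmann-Gibbs statistics; recovers Shannon entropy as q -> 1
    if q == 1:
        return shannon_entropy(p)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p), extropy(p), tsallis_entropy(p, q=2))
```

Two sanity checks follow from the definitions: for a two-point distribution, entropy and extropy coincide, and Tsallis entropy approaches Shannon entropy as q tends to 1.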