Abstract

Since entropy has several applications in information theory, for example in bi-level or multi-level thresholding of images, it is interesting to investigate the generalized additivity of Kaniadakis entropy for more than two systems. Here we consider the additivity for three, four and five systems, because we aim to apply Kaniadakis entropy to such multi-level analyses.

Highlights

  • As discussed in [1], over the last twelve years considerable research has been devoted to the foundations and applications of a generalized statistical theory based on the κ-distribution of probabilities. This distribution provides an entropy, the κ-entropy, known as the Kaniadakis entropy after Giorgio Kaniadakis, Politecnico di Torino, who proposed it together with the κ-distribution [2]

  • Like the well-known Tsallis entropy [3], the κ-entropy is a generalization of the entropy proposed by Shannon in 1948

  • Since the Shannon and Tsallis entropies are widely used for bi-level and multi-level thresholding in image processing [4,5,6,7], it could be interesting to use the Kaniadakis entropy for this purpose too
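To make the thresholding application mentioned in the highlights concrete, here is a minimal sketch of Kapur-style bi-level thresholding with the Shannon entropy: the threshold is chosen to maximize the sum of the entropies of the background and foreground histogram partitions. The function name and the example histogram are illustrative, not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy -sum p*ln(p), skipping zero-probability bins."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def kapur_threshold(hist):
    """Return the bin index t that maximizes H(background) + H(foreground),
    where hist is a normalized gray-level histogram (sums to 1)."""
    best_t, best_h = None, -math.inf
    for t in range(1, len(hist)):
        w0, w1 = sum(hist[:t]), sum(hist[t:])
        if w0 == 0 or w1 == 0:
            continue  # a class with zero mass has no defined entropy
        h0 = shannon_entropy([p / w0 for p in hist[:t]])
        h1 = shannon_entropy([p / w1 for p in hist[t:]])
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Illustrative 4-bin histogram: the uniform case splits in the middle.
print(kapur_threshold([0.25, 0.25, 0.25, 0.25]))  # → 2
```

Multi-level thresholding generalizes this by searching over several cut points at once; replacing `shannon_entropy` with a generalized entropy such as the κ-entropy is precisely the kind of use the paper has in view.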


Introduction

As discussed in [1], over the last twelve years considerable research has been devoted to the foundations and applications of a generalized statistical theory based on the κ-distribution of probabilities. This distribution provides an entropy, the κ-entropy, known as the Kaniadakis entropy after Giorgio Kaniadakis, Politecnico di Torino, who proposed it together with the κ-distribution [2]. In the limit κ → 0, the Kaniadakis entropy becomes the Shannon entropy, and we must recover the normal additivity: S(A∪B) = S(A) + S(B). Let us rewrite (9) to remark that it is symmetric when the two systems A and B are exchanged:
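The κ → 0 limit claimed above can be checked numerically. A minimal sketch, assuming the standard Kaniadakis definitions ln_κ(x) = (x^κ − x^(−κ))/(2κ) and S_κ = −Σ p_i ln_κ(p_i); the probability values are illustrative.

```python
import math

def ln_kappa(x, kappa):
    """Kaniadakis kappa-logarithm; reduces to ln(x) as kappa -> 0."""
    if kappa == 0:
        return math.log(x)
    return (x**kappa - x**(-kappa)) / (2 * kappa)

def kaniadakis_entropy(probs, kappa):
    """S_kappa = -sum_i p_i * ln_kappa(p_i), skipping zero bins."""
    return -sum(p * ln_kappa(p, kappa) for p in probs if p > 0)

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

probs = [0.5, 0.3, 0.2]
# For small kappa, S_kappa is numerically close to the Shannon entropy,
# and S_kappa is symmetric under kappa -> -kappa.
print(kaniadakis_entropy(probs, 1e-6), shannon_entropy(probs))
```

The symmetry under κ → −κ follows directly from the definition of ln_κ, since negating κ negates both the numerator and the denominator.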

