Abstract

In [1], we discussed the mutual information of two random variables and how it can be obtained from entropies. We considered the Shannon entropy and the nonadditive Tsallis entropy. Here, following the same approach used in the Tsallis case, we propose a method for obtaining the mutual entropy associated with another nonadditive entropy, the Kaniadakis entropy.
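
Concretely, in the Shannon case the mutual information follows from the additive identity I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal sketch of this computation for a discrete joint distribution (our illustration, not code from the paper; all function names are ours):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum p ln p (natural log), skipping zero cells."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution
    given as a 2-D probability table (rows: X, columns: Y)."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()          # normalize to a probability table
    px = joint.sum(axis=1)               # marginal distribution of X
    py = joint.sum(axis=0)               # marginal distribution of Y
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(joint.ravel())

# Independent variables give I = 0; perfectly correlated ones give I = H(X).
independent = np.outer([0.5, 0.5], [0.3, 0.7])
correlated = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(independent))   # ~0.0
print(mutual_information(correlated))    # ~ln 2 = 0.693... nats
```

This subtraction works because the Shannon entropy is additive over independent variables; for nonadditive entropies the analogous step is precisely where caution is needed.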

Highlights

  • In Ref. [1], we discussed the mutual information of two random variables and how it can be calculated from entropies

  • The calculation is straightforward for the Shannon entropy, whereas for the nonadditive Tsallis entropy it requires some caution

  • Following the same approach used for the Tsallis entropy, we propose a method for treating another nonadditive entropy, the Kaniadakis entropy, and determine its mutual entropy



Introduction

In Ref. [1], we discussed the mutual information of two random variables and how it can be calculated from entropies. The calculation is straightforward in the case of the Shannon entropy, whereas for the nonadditive Tsallis entropy it requires some caution; we recall that, as q approaches 1, the Tsallis entropy reduces to the Shannon entropy. Following the same approach used for the Tsallis entropy, we propose a method for treating another nonadditive entropy, the Kaniadakis entropy, and determine its mutual entropy. The relevant definitions and limits are sketched below.
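
For reference (standard definitions, not quoted from the paper; natural logarithms, probabilities p_i):

```latex
% Shannon entropy
S = -\sum_i p_i \ln p_i

% Tsallis entropy (nonadditive, entropic index q) and its q -> 1 limit
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i = S

% Kaniadakis entropy (nonadditive, index kappa), built on the
% kappa-deformed logarithm, recovering S as kappa -> 0
\ln_\kappa x = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa},
\qquad
S_\kappa = -\sum_i p_i \ln_\kappa p_i,
\qquad
\lim_{\kappa \to 0} S_\kappa = S
```

The nonadditivity is the source of the caution mentioned above: for independent variables the Tsallis entropy satisfies S_q(A,B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B), so the Shannon-style subtraction H(X) + H(Y) − H(X,Y) no longer vanishes for independent variables and must be suitably generalized.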

