Abstract

The mutual information of two random variables is easily obtained from their Shannon entropies. When nonadditive entropies are involved, however, the calculation of the mutual information is more complex. Here we first discuss the basics of mutual information as it follows from the Shannon entropy. We then analyse the case of the generalized nonadditive Tsallis entropy.
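
For reference, two standard relations underlying this discussion (not stated explicitly in the abstract): when the Shannon entropy H is used, the mutual information of X and Y decomposes as

I(X;Y) = H(X) + H(Y) - H(X,Y),

while the Tsallis entropy S_q is nonadditive, satisfying, for independent systems A and B,

S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B),

so the Shannon-style decomposition does not carry over directly.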

Highlights

  • In many engineering and telecommunication applications, it is often desirable to increase or decrease the dependency of two random variables

  • When the Shannon entropy is used, the mutual information can be decomposed into a sum of entropies [1]

  • Thanks to its entropic index, which can be used as a tuning parameter, the Tsallis entropy appears in several applications, in particular image processing and image registration [6] (its definition is recalled after this list)
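
As a standard definition, for concreteness: the Tsallis entropy of a discrete probability distribution p_1, ..., p_W is

S_q = (1 - \sum_i p_i^q) / (q - 1),

which recovers the Shannon entropy H = -\sum_i p_i \ln p_i in the limit q -> 1; the entropic index q is the tuning parameter mentioned above.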

Introduction

In many engineering and telecommunication applications, it is often desirable to increase or decrease the dependency of two random variables. This dependency is measured by the mutual information. When nonadditive entropies are involved, however, obtaining the mutual information is not so simple. Here we discuss the basics of the mutual information when the Tsallis entropy is involved.
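
As a minimal numerical sketch of the quantities discussed here (the function names and the example distribution below are illustrative assumptions, not taken from the paper), both the Shannon mutual information and the Tsallis entropy can be computed from a discrete joint distribution:

import numpy as np

def shannon_entropy(p):
    # Shannon entropy (in nats) of a probability vector; zero entries are skipped.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    # Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1); recovers Shannon as q -> 1.
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def mutual_information(pxy):
    # Shannon mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint distribution.
    px = pxy.sum(axis=1)   # marginal of X
    py = pxy.sum(axis=0)   # marginal of Y
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy.ravel())

# Hypothetical joint distribution of two dependent binary variables.
pxy = np.array([[0.3, 0.1],
                [0.1, 0.5]])
print(mutual_information(pxy))            # positive, since X and Y are dependent
print(tsallis_entropy(pxy.ravel(), 0.8))  # joint Tsallis entropy with index q = 0.8

The Tsallis analogue of the mutual information is precisely what requires care: since S_q is nonadditive, the simple sum-of-entropies decomposition used in mutual_information above does not transfer verbatim to S_q.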
