Abstract

In this chapter, we explain the concepts of information, discrete entropy, and mutual information in detail. To master these information-theoretic subjects, the reader needs a working knowledge of probability and random variables; we therefore suggest reviewing probability and random variables before studying the material in this chapter. Continuous entropy and continuous mutual information are closely related to discrete entropy and discrete mutual information, so the reader should first understand the fundamental concepts explained here thoroughly and then proceed to the other chapters of the book.
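
As a brief, hedged preview of the quantities the chapter defines, the Python sketch below computes the discrete entropy of a probability distribution and the mutual information of a joint distribution, using the standard formulas H(X) = -Σ p(x) log2 p(x) and I(X;Y) = H(X) + H(Y) - H(X,Y). The function names and the example distributions are illustrative choices and do not come from the chapter itself.

```python
import numpy as np

def discrete_entropy(p):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p(x) = 0 contribute nothing to the sum
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), in bits,
    computed from a joint probability table pxy[i, j] = P(X=i, Y=j)."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1)  # marginal distribution of X
    py = pxy.sum(axis=0)  # marginal distribution of Y
    return (discrete_entropy(px) + discrete_entropy(py)
            - discrete_entropy(pxy.ravel()))

# A fair coin carries 1 bit of entropy.
print(discrete_entropy([0.5, 0.5]))  # 1.0

# If X and Y are perfectly correlated, I(X;Y) equals H(X) = 1 bit.
joint = [[0.5, 0.0],
         [0.0, 0.5]]
print(mutual_information(joint))  # 1.0
```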
