Abstract

In this chapter, we provide the basic concepts of information theory and coding theory. These topics are described to the level needed to more easily understand various topics in later chapters, such as quantum information theory, quantum error correction, fault-tolerant error correction, fault-tolerant computing, and QKD. The chapter starts with definitions of entropy, joint entropy, conditional entropy, relative entropy, mutual information, and channel capacity, followed by the information capacity theorem. We also briefly describe the source coding and data compaction concepts. We then discuss the channel capacity of discrete memoryless channels, continuous channels, and some optical channels. After that, the fundamentals of block codes are introduced, including linear block codes (LBCs), the definitions of generator and parity-check matrices, syndrome decoding, distance properties of LBCs, and some important coding bounds. Further, cyclic codes are introduced. The BCH codes are described next. The RS codes, concatenated codes, and product codes are then described. After a short summary section, a set of problems is provided for readers to gain a deeper understanding of information theory and classical error correction.
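To give a flavor of the quantities the chapter opens with, the following minimal Python sketch computes the binary entropy function and, from it, the capacity of a binary symmetric channel (BSC) with crossover probability p, using the standard result C = 1 - H(p). The function names are illustrative, not taken from the chapter.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity (bits per channel use) of a BSC with crossover probability p."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries 1 bit per use; a channel that flips
# each bit with probability 1/2 carries no information at all.
print(bsc_capacity(0.0))  # 1.0
print(bsc_capacity(0.5))  # 0.0
```

The same entropy function underlies the coding bounds discussed later in the chapter, where it appears in asymptotic rate expressions.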
