Abstract
Although Statistical Mechanics and Information Theory [1], [2] share common origins (Shannon borrowed the notion of entropy from Statistical Mechanics, and conversely some authors have tried to formulate Statistical Mechanics starting from Information Theory), the two branches of science have evolved completely independently of each other. The point we will try to make in this talk is that the two subjects are much more closely related than is generally believed. There are deep connections between error-correcting codes and certain theoretical models of disordered spin systems [3]. In particular, we will show that maximum likelihood decoding of error-correcting codes is mathematically equivalent to finding the ground state of a certain spin Hamiltonian. We will also explicitly construct a code which is optimal, i.e. which allows error-free communication provided the transmission rate does not exceed the channel capacity (see below). This code corresponds to an exactly soluble spin-glass model, the "random energy model" proposed by Derrida. The only other known explicit optimal code is the family of "pulse position modulation" or "ppm" codes. Finally, we will show that generalized spin-glass models, where the "spins" are appropriately chosen matrices (forming a representation of a finite group), define a new class of codes which could be of potential interest for practical applications. These codes can be thought of as generalizations of ppm codes.
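To make the stated equivalence concrete, here is a minimal sketch of the kind of construction the abstract refers to; the k-spin notation below is our own illustrative choice, not reproduced verbatim from the talk. A message is a set of Ising spins sigma_i^0 = +/-1, the transmitted bits are products of k of them, and the noisy received values J play the role of quenched couplings:

% Illustrative (assumed) notation: sigma^0 is the source message,
% J are the received, noise-corrupted couplings.
\[
  J^{0}_{i_1 \dots i_k} = \sigma^{0}_{i_1} \cdots \sigma^{0}_{i_k},
  \qquad
  H[\sigma] = - \sum_{i_1 < \dots < i_k} J_{i_1 \dots i_k}\, \sigma_{i_1} \cdots \sigma_{i_k}.
\]

For a memoryless channel the likelihood of a candidate message {sigma_i} decreases monotonically with H[sigma], so maximum likelihood decoding amounts to searching for the ground state of this k-spin, spin-glass-like Hamiltonian; in the large-k limit such models reduce to Derrida's random energy model, which is how the optimal code mentioned above arises.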