Abstract

Information theory, statistical theory of signal transmission, communication theory and Shannon theory are synonymous names for the mathematical theory first published by Claude Shannon in the 1940s. The concept of entropy in this theory can be viewed as a measure of the 'randomness' of sequences of symbols. Shannon entropy and its variants have been widely used in molecular biology and bioinformatics as statistical tools of choice for sequence and structure analyses.
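As a concrete illustration of entropy as a measure of 'randomness', the sketch below computes the Shannon entropy H = -Σ pᵢ log₂ pᵢ (in bits per symbol) from the empirical symbol frequencies of a sequence; the function name and the example DNA strings are illustrative choices, not taken from the article.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a symbol sequence,
    estimated from empirical symbol frequencies (bits per symbol)."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A sequence using all four DNA letters equally often attains the
# maximum of log2(4) = 2 bits/symbol; a one-letter sequence has entropy 0.
print(shannon_entropy("ACGT" * 10))  # 2.0
print(shannon_entropy("AAAAAAAAAA"))  # 0.0
```

Low-entropy regions of a biological sequence correspond to repetitive or compositionally biased stretches, which is why entropy-based filters are common in sequence analysis.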
