Abstract

In the information theory community, the following “historical” statements are generally well accepted: (1) Hartley put forth his rule twenty years before Shannon; (2) Shannon’s formula, as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio, came as an unexpected result in 1948; (3) Hartley’s rule is inexact, while Shannon’s formula is characteristic of the additive white Gaussian noise channel; (4) Hartley’s rule is an imprecise relation that does not provide a proper formula for the capacity of a communication channel. We show that all four of these statements are somewhat wrong. In fact, a careful calculation shows that “Hartley’s rule” coincides with Shannon’s formula. We explain this mathematical coincidence by deriving the necessary and sufficient conditions on an additive noise channel under which its capacity is given by Shannon’s formula, and we construct a sequence of such channels that links the uniform (Hartley) and Gaussian (Shannon) channels.

Highlights

  • As researchers in information theory, we all know that the milestone event that founded our field is Shannon’s publication of his seminal 1948 paper [1], which created a completely new branch of applied mathematics and called it to immediate worldwide attention.

  • The difference between this formula and (1) is essentially the content of the sampling theorem, often referred to as Shannon’s sampling theorem, which states that the number of independent samples that can be put through a channel of bandwidth W hertz is 2W samples per second.

  • Hartley [5] was the first researcher to attempt to formulate a theory of the transmission of information.


Summary

Introduction

As researchers in information theory, we all know that the milestone event that founded our field is Shannon’s publication of his seminal 1948 paper [1], which created a completely new branch of applied mathematics and called it to immediate worldwide attention. Hartley’s name is commonly associated with the theorem because of the so-called Hartley’s law, which is described as follows: in 1928, Hartley formulated a way to quantify information and its line rate (also known as the data signalling rate R, in bits per second) [5]. This method, later known as Hartley’s law, became an important precursor to Shannon’s more sophisticated notion of channel capacity.
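To make the tradeoff concrete, here is a minimal sketch of Shannon’s formula C = W log2(1 + SNR), together with the equivalent per-sample view given by the sampling theorem (2W independent samples per second, each carrying ½ log2(1 + SNR) bits). The 3 kHz bandwidth and 30 dB SNR figures are illustrative assumptions, not values from the paper:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon's formula: capacity in bits/s as a function of bandwidth and SNR."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def capacity_from_samples(bandwidth_hz: float, snr_linear: float) -> float:
    """Same quantity via the sampling theorem: 2W samples/s, each worth
    (1/2) * log2(1 + SNR) bits, so the two expressions agree exactly."""
    per_sample_bits = 0.5 * math.log2(1.0 + snr_linear)
    return 2.0 * bandwidth_hz * per_sample_bits

# Illustrative (assumed) values: a 3 kHz channel at 30 dB SNR.
W = 3000.0
snr = 10.0 ** (30.0 / 10.0)  # 30 dB -> linear SNR of 1000

assert math.isclose(shannon_capacity(W, snr), capacity_from_samples(W, snr))
print(round(shannon_capacity(W, snr)))  # about 29902 bits/s
```

Doubling the bandwidth doubles the capacity, while doubling the SNR adds only about W bits per second, which is the rate–bandwidth–SNR tradeoff the abstract refers to.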

Hartley’s Rule is not Hartley’s
Independent 1948 Derivations of Shannon’s Formula
Hartley’s Rule as a Capacity Formula
Conditions for Shannon’s Formula to Hold
B-Spline Channels of Any Degree
