Abstract

This chapter offers a brief discussion of the mathematical theory of Benford's law. This law is the observation that in many collections of numbers, be they mathematical tables, real-life data, or combinations thereof, the leading significant digits are not uniformly distributed, as might be expected, but are heavily skewed toward the smaller digits. More specifically, Benford's law states that the significant digits in many data sets follow a very particular logarithmic distribution. The chapter lays out the basic theory of Benford's law before highlighting its more specific components: the significant digits and the significand (function), as well as the Benford property and its four characterizations. Finally, the chapter presents the basic theory of Benford's law in the context of deterministic and random processes.
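
The "particular logarithmic distribution" referenced above is, for the leading digit, $P(D_1 = d) = \log_{10}(1 + 1/d)$ for $d = 1, \dots, 9$, so a leading 1 occurs about 30.1% of the time while a leading 9 occurs only about 4.6% of the time. As a minimal illustration (not taken from the chapter itself), the Python sketch below compares these probabilities against the empirical first-digit frequencies of the powers of 2, a classic deterministic sequence known to satisfy Benford's law:

```python
import math
from collections import Counter

# Benford first-digit probabilities: P(D1 = d) = log10(1 + 1/d)
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

# Empirical first-digit frequencies of 2^1, ..., 2^10000,
# a sequence known to follow Benford's law
leading = [int(str(2 ** n)[0]) for n in range(1, 10001)]
counts = Counter(leading)

for d in range(1, 10):
    print(f"d={d}: empirical {counts[d] / len(leading):.4f}  "
          f"Benford {benford[d]:.4f}")
```

Running this shows the empirical frequencies agreeing with the Benford probabilities to within a few parts in ten thousand, in line with the deterministic-process results the chapter surveys.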
