Abstract

In relation to the letters of an alphabet, entropy denotes the average number of binary digits required to transmit one character. Checking tables of statistical data, one finds that the digits 1 to 9 occur with different frequencies in the first position of the numbers. Correspondingly, a value for the Shannon entropy H can be determined from these probabilities as well. Although in many cases the Newcomb–Benford (NB) Law applies, distributions have been found in which the 1 occurs in the first position more than 40 times as frequently as the 9. In such cases, the probability of occurrence of a particular first digit can be derived from a power function d^(-p) with exponent p > 1. While the entropy of first digits following an NB distribution amounts to H = 2.88 bits per digit, for other data distributions (diameters of craters on Venus or the weights of fragments of crushed minerals), entropy values of 2.76 and 2.04 bits per digit have been found.
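As a quick illustration (a sketch, not taken from the paper itself), the NB entropy figure can be reproduced from the first-digit probabilities P(d) = log10(1 + 1/d). The power-law exponent p = 1.7 below is a hypothetical value, chosen only so that P(1)/P(9) exceeds 40 as the abstract describes; the normalization over the digits 1–9 is likewise an assumption.

    import math

    # Newcomb–Benford probability of leading digit d: P(d) = log10(1 + 1/d)
    def benford_p(d):
        return math.log10(1 + 1 / d)

    # Power-law alternative described in the abstract: P(d) proportional to
    # d^(-p) with p > 1; exponent value and normalization are assumptions.
    def power_law_p(d, p):
        norm = sum(k ** -p for k in range(1, 10))
        return d ** -p / norm

    def entropy_bits(probs):
        # Shannon entropy H = -sum(p * log2(p)), in bits per digit
        return -sum(p * math.log2(p) for p in probs if p > 0)

    benford = [benford_p(d) for d in range(1, 10)]
    print(f"NB entropy: {entropy_bits(benford):.2f} bits per digit")  # -> 2.88

    # Hypothetical exponent: p = 1.7 gives P(1)/P(9) = 9**1.7, roughly 42 (> 40)
    power = [power_law_p(d, p=1.7) for d in range(1, 10)]
    print(f"P(1)/P(9) = {power[0] / power[8]:.1f}")
    print(f"Power-law entropy: {entropy_bits(power):.2f} bits per digit")

Running this confirms the 2.88 bits per digit figure for the NB distribution; the entropy of the power-law case depends on the chosen exponent, so the 2.76 and 2.04 values reported for the empirical data sets are not reproduced here.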

