Abstract
How many bits of information about an integer do we learn from each of its prime factors? Trying to answer that question in a precise manner leads to an elementary information-theoretic proof of a well-known, nontrivial result in number theory, namely that
$$\sum_{p \le n} \frac{\log p}{p} \sim \log n \quad \text{as } n \to \infty, \qquad (1)$$
where the sum is over all primes p not exceeding n. In fact, we obtain finite-n bounds that refine this limit. This result, originally proved by Chebyshev in 1852, is closely related to the celebrated prime number theorem. Our main goal is to show that basic information-theoretic arguments, combined with elementary computations, can be used to give a new proof [2] of Chebyshev's classical result (1). The proof follows, in part, along the lines of a heuristic argument due to Billingsley [1]. We briefly outline the connection between Chebyshev's result and Gauss' prime number theorem, and we also give a brief survey of other instances in which information-theoretic ideas have been employed in the context of number theory.
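As a quick numerical illustration (not part of the paper's argument), the following Python sketch compares $\sum_{p \le n} \log p / p$ with $\log n$ for a few values of n; the gap between the two remaining bounded as n grows is consistent with Chebyshev's estimate (1).

```python
# Minimal sketch: numerically compare sum_{p <= n} (log p)/p with log n.
import math

def primes_up_to(n):
    """Sieve of Eratosthenes: return all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i, is_prime in enumerate(sieve) if is_prime]

for n in (10**3, 10**4, 10**5, 10**6):
    s = sum(math.log(p) / p for p in primes_up_to(n))
    print(f"n = {n:>8}:  sum = {s:8.4f},  log n = {math.log(n):8.4f},  "
          f"diff = {s - math.log(n):+.4f}")
```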