Abstract

In the famous paper in which he introduced what is now known as the Turing machine, Alan Turing gave a definition of computable real numbers under which it turns out that multiplication by 3 is uncomputable. This shortcoming vanished in a Correction to his paper that Turing himself published shortly afterwards, but it clearly illustrates the subtlety of getting definitions of computability right. In this paper, we give the name “printable” to the real numbers that Turing originally called “computable”, we recall what is now the generally accepted definition of computable real numbers (which is not quite Turing's amended definition, but is equivalent to it), and we contrast the two notions. Although multiplication by 3 is uncomputable on printable numbers, while it is computable on computable numbers, a real number is computable if and only if it is printable. The resolution of this apparent paradox is that no machine can transform the “computable” description of a real number into its “printable” description, as Turing proved in his Correction. Finally, we address the subtle issue of whether or not to allow the printable description of a real number to end with an infinite sequence of 9s (or of 1s in binary), which was left open by Turing in his Correction. Several of these results were already known, as they appear in scattered places, some in non-refereed publications, but we give a unified treatment with some different proofs and a historical perspective.
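The obstruction behind this uncomputability can be illustrated with a short sketch (ours, not from the paper). A machine that prints the digits of 3x from the digits of x must, at some point, commit to the integer part of 3x. If the digits of x seen so far are all 3s, then x may still turn out to be slightly below 1/3 (so 3x starts 0.9…) or slightly above (so 3x starts 1.0…), and no finite prefix ever resolves the dilemma:

```python
# Illustrative sketch, assuming x lies in [0, 1) and is given by its
# decimal digits. We ask: does a finite prefix of those digits already
# determine the integer part of 3x?
from fractions import Fraction

def int_part_of_3x(prefix):
    """Return the integer part of 3x if the given finite digit prefix
    of x already determines it, otherwise None (more digits needed)."""
    n = len(prefix)
    value = Fraction(sum(d * 10**(n - 1 - i) for i, d in enumerate(prefix)), 10**n)
    lo = 3 * value                 # smallest possible 3x (remaining digits all 0)
    hi = lo + Fraction(3, 10**n)   # 3x lies in the half-open interval [lo, hi)
    return int(lo) if hi <= int(lo) + 1 else None

# For x = 0.5... the very first digit settles it: 3x is in [1.5, 1.8).
print(int_part_of_3x([5]))          # 1
# But for a stream of 3s, no finite prefix ever settles it:
for n in (1, 5, 20):
    print(int_part_of_3x([3] * n))  # None every time
```

The helper name and the decimal (rather than binary) setting are our choices for illustration; the same phenomenon is what Turing's Correction addresses by describing reals via converging rational approximations instead of printed digit sequences.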
