The history of computers, particularly digital ones, dates from the first quarter of the 17th century (see ref. 1 for review). The first known machine was built by Wilhelm Schickard, a professor at Tübingen and a friend of Kepler's. Interestingly, this occurred at about the same time that Napier invented logarithms. The device was built, but it, as well as the copy intended for Kepler, and the inventor himself were destroyed by the fires and plagues of the Thirty Years' War. The next machine, copies of which still exist, was built by Pascal and was described in Diderot's Encyclopédie. This device became an important part of a desk calculator designed and constructed by Leibniz, who said:

"Also the astronomers surely will not have to continue to exercise the patience which is required for computation. It is this that deters them from computing or correcting tables, from the construction of Ephemerides, from working on hypotheses, and from discussions of observations with each other. For it is unworthy of excellent men to lose hours like slaves in the labor of calculation which could safely be relegated to anyone else if machines were used."

This enunciation by Leibniz of a purpose for automatic computing is a memorable one. Science, or at least mathematical astronomy, had advanced far enough by his time that calculation was a real burden, and it was recognized to some extent how this burden could be lightened. Certainly Kepler, using tables of logarithms he himself calculated on Napier's scheme, did extensive calculations in order to produce his Rudolphine Tables. The time for the digital principle, however, had still not come: even by the early part of the 19th century the Nautical Almanac was being calculated by groups of human computers, each making separate and independent calculations, with attendant errors.
The situation was so bad by 1823 that Charles Babbage, one of the founders of the Royal Astronomical Society in 1820, set out to create a digital device to construct tables by a method, certainly well known to Newton, called subtabulation. In this method, one first calculates a comparatively few entries in a table by hand, and the entries lying between them are then filled in by systematic interpolation using essentially only additions and subtractions. For various reasons this was a propitious time in English history to attempt to automate computation. Trevelyan (2) tells
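The principle behind subtabulation can be sketched in a few lines: for a polynomial, repeated differences eventually become constant, so once a handful of entries are computed by hand, every later entry follows by additions alone. The function names and the example polynomial below are illustrative choices, not drawn from the original text.

```python
def subtabulate(seed_values, count):
    """Extend a polynomial table using only additions.

    seed_values: enough hand-computed entries to determine the
    constant top difference (degree + 1 of them).
    """
    # Turn the seed entries into the initial difference column
    # [f(x0), delta f(x0), delta^2 f(x0), ...].
    diffs = list(seed_values)
    for level in range(1, len(diffs)):
        for i in range(len(diffs) - 1, level - 1, -1):
            diffs[i] -= diffs[i - 1]

    table = []
    for _ in range(count):
        table.append(diffs[0])
        # Produce the next entry by cascading additions up the column,
        # exactly the operation a difference engine mechanizes.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Example: f(x) = x**2, seeded from three entries computed "by hand".
seed = [0, 1, 4]             # f(0), f(1), f(2)
print(subtabulate(seed, 7))  # [0, 1, 4, 9, 16, 25, 36]
```

Note that once the difference column is set up, the inner loop performs nothing but additions, which is what made the method attractive to mechanize.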