We analyze Feynman's work on the response of an amplifier, performed at Los Alamos and described in a 1946 technical report, as well as lectured on at Cornell University in 1946-47 during his course on Mathematical Methods. The motivation for this work was Feynman's involvement in the Manhattan Project, which required feeding the output pulses of counters into amplifiers or other circuits, with the risk of introducing distortion at each step. To deal with this problem, Feynman designed a theoretical "reference amplifier", which made it possible to characterize the distortion by means of a benchmark relationship between phase and amplification at each frequency, and which provided a standard tool for comparing the operation of real devices. He elaborated a general theory from which the basic features of an amplifier can be deduced just from its response to a pulse or to a sine wave of definite frequency. Moreover, to apply this theory to practical problems, he worked out a couple of remarkable examples, both for high-frequency-cutoff amplifiers and for low-frequency ones. Special consideration is deserved by a mysteriously exceptional amplifier with the best stability behavior, introduced by Feynman, for which different physical interpretations are envisaged here. Feynman's earlier work later flowed into the Hughes lectures on Mathematical Methods in Physics and Engineering of 1970-71, where he also remarked on the causality properties of an amplifier, that is, on certain relations between frequency and phase shift that a real amplifier must satisfy so that output signals cannot appear before input ones. Quite interestingly, dispersion relations to be satisfied by the response function were introduced.
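For reference, the causality constraint mentioned at the end is commonly expressed through Kramers-Kronig-type dispersion relations; the following is the standard textbook form, not necessarily the notation used in the paper itself. For a response function \( G(\omega) \) that is analytic in the upper half of the complex frequency plane and vanishes at large \( |\omega| \),

% Kramers-Kronig dispersion relations for a causal response G(omega);
% \mathcal{P} denotes the Cauchy principal value of the integral.
\[
\operatorname{Re} G(\omega) = \frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{+\infty} \frac{\operatorname{Im} G(\omega')}{\omega' - \omega}\, d\omega',
\qquad
\operatorname{Im} G(\omega) = -\frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{+\infty} \frac{\operatorname{Re} G(\omega')}{\omega' - \omega}\, d\omega',
\]

so that the real and imaginary parts of the response, and hence amplification and phase shift at each frequency, are not independent of one another.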