Among the errors frequently made by students in a calculus course are false assumptions such as the following: (a + b)² = a² + b², sin(a + b) = sin a + sin b, log(a + b) = log a + log b, for all real a and b. These would seem to be examples of the student's tendency to overgeneralize properties of operations, in this case the distributive property, as discussed by Shumway [3]. Here the function operator is assumed to distribute over the real operation of addition. Typically, one points out the mistake in such assumptions by providing counterexamples, or negative instances. Shumway's review of the literature finds that this strategy is not always successful and, in fact, that the use of negative instances can have a detrimental effect on concept learning. One might then wish to try an alternative approach and address the question directly: What functions f, if any, have the property f(a + b) = f(a) + f(b) for all real a and b?

The problem, identified as Cauchy's equation, has received a great deal of attention over the years. Young [4] makes reference to the problem as one frequently posed in advanced calculus. Aczel [1] provides an interesting historical account of the problem. In both of these works, the emphasis seems to be on looking for solutions to the problem under conditions of continuity or boundedness of the function f on a closed interval. Arguments under these conditions require a degree of sophistication generally identified with upper-level undergraduate mathematics students. If, however, one is willing to require the stronger condition of differentiability, then the solution to the problem is accessible to the calculus student. Since most of the functions in elementary calculus are differentiable at all but perhaps a few points, it seems reasonable to work under this condition. The problem which will then be considered here is: What differentiable