Let $K$ denote a compact selfadjoint operator acting on a Hilbert space $H$, and let $L$ denote a one-dimensional selfadjoint operator, also acting on $H$. It is shown that the eigenvalues of $K$ and $K+L$ interlace on the real axis.

The purpose of this article is to prove the following theorem.

THEOREM. Let $K$ denote a compact, selfadjoint operator acting on a Hilbert space $H$. We assume that the nullspace of $K$ is trivial. Let $L$ denote a one-dimensional, selfadjoint operator, also acting on $H$. Between every pair of distinct, successive eigenvalues $\lambda_i$, $\lambda_{i+1}$ of $K$ there is precisely one eigenvalue of $K+L$, lying in one of the intervals $[\lambda_i, \lambda_{i+1})$, $(\lambda_i, \lambda_{i+1}]$, or $(\lambda_i, \lambda_{i+1})$. Every eigenvalue of $K$ of multiplicity $n > 1$ is also an eigenvalue of $K+L$, of multiplicity $n$ or $n-1$.

The above theorem may be viewed as a generalization of certain theorems regarding second order differential equations. Consider, for example,
\[
(1) \qquad y'' + (\lambda + q(x))y = 0,
\]
where $q(x)$ is real and continuous on $[0,1]$, subject to the boundary conditions
(a) $y(0)=0$, $y(1)=0$;
(b) $y(0)=0$, $y'(1)=0$.
The eigenvalues of (1)(a) and (1)(b) are real and alternate on the real axis. This fact is well known; it is, however, an immediate consequence of the above theorem. Both problems (1)(a) and (1)(b) can be investigated by converting the differential operators into compact integral operators. The latter differ by a one-dimensional operator, so that the above theorem can be invoked.
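As an illustration of the last remark, here is a minimal sketch of the reduction, made under the simplifying assumption (introduced for this illustration only) that $q \equiv 0$ in (1). The boundary value problems (1)(a) and (1)(b) are then equivalent to eigenvalue problems for the integral operators on $L^2[0,1]$ whose kernels are the Green's functions of $-d^2/dx^2$ under the respective boundary conditions:
\[
G_a(x,t) =
\begin{cases}
x(1-t), & 0 \le x \le t \le 1,\\
t(1-x), & 0 \le t \le x \le 1,
\end{cases}
\qquad
G_b(x,t) = \min(x,t).
\]
A direct computation gives
\[
G_b(x,t) - G_a(x,t) = xt,
\]
so the two integral operators differ by the one-dimensional selfadjoint operator $(Lf)(x) = x \int_0^1 t\,f(t)\,dt$. Since the nonzero eigenvalues of each integral operator are the reciprocals of the eigenvalues of the corresponding boundary value problem, the interlacing asserted in the theorem translates into the alternation of the eigenvalues of (1)(a) and (1)(b). For $q \not\equiv 0$ the reduction is analogous, using the Green's functions of the full differential operator, whose difference is again a one-dimensional operator as stated above.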