This note is concerned with the problem of finding the roots of the determinantal equation |A + λB| = 0. If B is nonsingular, the problem is equivalent to finding the roots of the equation |B⁻¹A + λI| = 0. The latter problem can be solved by reducing B⁻¹A to Hessenberg form and then applying Hyman's method, coupled with the Laguerre algorithm, to locate the eigenvalues [1]. The purpose of this note is to show that the reduction to Hessenberg form can be carried out without first reducing the problem to the standard form |A + λI| = 0. It is possible to define elementary row and column transformations, with corresponding matrices P and Q, such that Ā = PAQ is an upper Hessenberg matrix and B̄ = PBQ is an upper triangular matrix. Since |P| = |Q| = ±1, the eigenvalue problem |Ā + λB̄| = 0 is equivalent to the original problem. The eigenvalues of the transformed equation may be computed efficiently using the method devised by Parlett [1].

The referee for this paper called the author's attention to the fact that the general problem |A + λB| = 0 can be reduced to the standard problem |A + λI| = 0 even if B is singular. This apparently involves a determination of the rank of B and also a check of the linear independence of certain rows of a transformation of A.

The method described herein requires almost twice the computing time of Hyman's method applied to the standard problem, and Hyman's method is probably not as fast as the Q-R algorithm. The method described herein is therefore probably inferior to one that reduces the problem to standard form. On the other hand, the reduction to standard form might introduce greater error than Hyman's method applied directly to A + λB, but we have not made any comparisons.

We now describe the reduction to Hessenberg form. Gaussian elimination with interchanges applied to the rows of B can be used to reduce B to upper triangular form.
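The equivalence claimed above for nonsingular B can be checked numerically. The following is a minimal numpy sketch (the matrices are illustrative random examples, not taken from the note): the roots of |A + λB| = 0 are the negatives of the eigenvalues of B⁻¹A, and each root drives det(A + λB) to zero.

```python
import numpy as np

# Illustrative random pencil; B is almost surely nonsingular.
rng = np.random.default_rng(0)
N = 5
A = rng.standard_normal((N, N))
B = rng.standard_normal((N, N))

# Roots of |B^{-1}A + lambda*I| = 0 are the negatives of the
# eigenvalues of B^{-1}A.
lam = -np.linalg.eigvals(np.linalg.solve(B, A))

# Each root should make det(A + lambda*B) vanish up to roundoff.
residuals = [abs(np.linalg.det(A + l * B)) for l in lam]
print(max(residuals))   # typically tiny for a well-conditioned pencil
```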
Gaussian elimination on the rows of B enables us to define a matrix P^(1) such that B^(1) = P^(1)B is upper triangular (that is, b^(1)_{ij} = 0 if j < i). Also we have |P^(1)| = ±1 and |p^(1)_{ij}| ≤ 1. We use the notation A^(1) = P^(1)A and denote the order of A and B by N.

Next we interchange the Nth and (N−1)st rows of A^(1) (if necessary) so that |a^(1)_{N−1,1}| ≥ |a^(1)_{N,1}|. Then we add a multiple of the (N−1)st row to the Nth row so that the element a^(2)_{N,1} is replaced by zero. This defines a matrix P^(2). Note that the matrix B^(2) = P^(2)B^(1) is no longer triangular: it may have a nonzero element b^(2)_{N,N−1}. By Gaussian elimination with interchanges on the last two columns of B^(2) we may replace b^(2)_{N,N−1} by zero. This defines a matrix Q^(3) such that |Q^(3)| = ±1 and B^(3) = B^(2)Q^(3) is triangular. Note that these column operations have no effect on the first column of A^(2); thus a^(3)_{N,1} = 0, where A^(3) = A^(2)Q^(3). We may continue in this way until the first column of A has been reduced to Hessenberg form. The same method reduces the remaining columns of A to Hessenberg form while leaving B in triangular form.
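The whole reduction can be sketched in numpy as follows. This is a hedged illustration of the procedure just described, not the author's code: the function name and variables are mine, and elementary Gaussian transformations with interchanges stand in for the matrices P^(k) and Q^(k).

```python
import numpy as np

def hessenberg_triangular(A, B):
    """Return (H, T), H upper Hessenberg and T upper triangular, obtained
    from the pencil (A, B) by elementary transformations of determinant +-1."""
    A = np.array(A, dtype=float)
    B = np.array(B, dtype=float)
    N = A.shape[0]

    # Stage 1 (the matrix P^(1)): Gaussian elimination with row
    # interchanges makes B upper triangular; A gets the same row operations.
    for j in range(N - 1):
        p = j + np.argmax(np.abs(B[j:, j]))
        if p != j:
            B[[j, p]], A[[j, p]] = B[[p, j]].copy(), A[[p, j]].copy()
        for i in range(j + 1, N):
            if B[i, j] != 0.0:
                m = B[i, j] / B[j, j]
                B[i, :] -= m * B[j, :]
                A[i, :] -= m * A[j, :]
                B[i, j] = 0.0

    # Stage 2: zero A's entries below the subdiagonal, column by column,
    # working from the bottom row up.  Each row operation fills one
    # subdiagonal entry of B, which a column operation on two adjacent
    # columns then removes.
    for j in range(N - 2):
        for i in range(N - 1, j + 1, -1):
            if abs(A[i, j]) > abs(A[i - 1, j]):     # row interchange
                A[[i - 1, i]] = A[[i, i - 1]].copy()
                B[[i - 1, i]] = B[[i, i - 1]].copy()
            if A[i, j] != 0.0:
                m = A[i, j] / A[i - 1, j]
                A[i, :] -= m * A[i - 1, :]
                B[i, :] -= m * B[i - 1, :]
                A[i, j] = 0.0
            # B may now have one nonzero below its diagonal, at (i, i-1).
            if abs(B[i, i - 1]) > abs(B[i, i]):     # column interchange
                B[:, [i - 1, i]] = B[:, [i, i - 1]].copy()
                A[:, [i - 1, i]] = A[:, [i, i - 1]].copy()
            if B[i, i - 1] != 0.0:
                m = B[i, i - 1] / B[i, i]
                B[:, i - 1] -= m * B[:, i]
                A[:, i - 1] -= m * A[:, i]
                B[i, i - 1] = 0.0
    return A, B

# Illustrative check on a random pencil: the structure is as claimed and
# the roots of the determinantal equation are preserved.
rng = np.random.default_rng(1)
N = 6
A0 = rng.standard_normal((N, N))
B0 = rng.standard_normal((N, N))
H, T = hessenberg_triangular(A0, B0)
assert np.allclose(np.tril(H, -2), 0.0)    # H is upper Hessenberg
assert np.allclose(np.tril(T, -1), 0.0)    # T is upper triangular
ev0 = np.sort_complex(np.linalg.eigvals(np.linalg.solve(B0, A0)))
ev1 = np.sort_complex(np.linalg.eigvals(np.linalg.solve(T, H)))
print(np.max(np.abs(ev0 - ev1)))           # small: the eigenvalues agree
```

Because every transformation is applied as a row operation on the left and a column operation on the right of both A and B, and each has determinant ±1, det(H + λT) agrees with det(A + λB) up to sign, so the roots are unchanged.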