TO MOST economists, the single equation least-squares regression model, like an old friend, is tried and true. Its properties and limitations have been extensively studied and documented and are, for the most part, well-known. Any good text in econometrics can lay out the assumptions on which common versions of the model are based and provide a reasonably coherent, perhaps even lucid, discussion of problems that arise as particular assumptions are violated. A short bibliography of definitive papers on such classical problems as non-normality, heteroscedasticity, serial correlation, feedback, etc., completes the job. As with most old friends, however, the longer one knows least squares, the more one learns about it. An admiration for its robustness under departures from many assumptions is sure to grow. The admiration must be tempered, however, by an appreciation of the model's sensitivity to certain other conditions. The requirement that explanatory variables be truly independent of one another is one of these.

Proper treatment of the model's classical problems ordinarily involves two separate stages: detection and correction. The Durbin-Watson test for serial correlation, combined with Cochrane and Orcutt's suggested first differencing procedure, is an obvious example.¹ Bartlett's test for variance heterogeneity, followed by a data transformation to restore homoscedasticity, is another.² No such treatment has been developed, however, for problems that arise as multicollinearity is encountered in regression analysis.

Attention will focus here on what we consider to be the first step in a proper treatment of the multicollinearity problem: its detection, or diagnosis. Economists are coming more and more to agree that the second step, correction, requires the generation of additional information.³ Just how this information is to be obtained depends largely on the tastes of an investigator and on the specifics of a particular problem.
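The two-stage pattern of detection and correction can be illustrated with a brief sketch (not part of the original paper): the Durbin-Watson statistic detects first-order serial correlation in the residuals, and a Cochrane-Orcutt-style quasi-differencing transform corrects for it. The function names here are our own.

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: values near 2 suggest no first-order serial
    correlation; values near 0 suggest positive, near 4 negative, correlation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

def cochrane_orcutt_transform(y, x, rho):
    """Quasi-difference the data with an estimated rho; rho = 1 reduces to
    the first-differencing procedure mentioned in the text."""
    y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
    return y[1:] - rho * y[:-1], x[1:] - rho * x[:-1]
```

In the two-stage procedure, one would regress on the transformed data and recompute the statistic to check that the correction succeeded.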
It may involve additional primary data collection, the use of extraneous parameter estimates from secondary data sources, or the application of subjective information through constrained regression or through Bayesian estimation procedures. Whatever its source, however, selectivity, and thereby efficiency, in generating the added information requires a systematic procedure for detecting its need, i.e., for detecting the existence, measuring the extent, and pinpointing the location and causes of multicollinearity within a set of independent variables. Measures are proposed here that, in our opinion, fill this need.

The paper's basic organization can be outlined briefly as follows. In the next section the multicollinearity problem's basic, formal nature is developed and illustrated. A discussion of historical approaches to the problem follows. With this as background, an attempt is made to define multicollinearity in terms of departures from a hypothesized statistical condition.

* The authors are Associate Professor of Finance at the Sloan School of Management, M.I.T., and Assistant Professor of Business Administration at the Harvard Business School, respectively. We are indebted to Professor John R. Meyer for introducing us to the multicollinearity problem and for advice and encouragement during the present effort to place it in perspective, and to Professors John Lintner and Robert Schlaifer for their comments and criticisms. Responsibility for specific interpretations, especially erroneous ones, remains our own. This research was supported by the Institute of Naval Studies, of which both authors were members at the time the work was conducted, and by grants from the Ford Foundation to both the Sloan School of Management and the Harvard Business School. Computation time and facilities were provided by the Computation Centers of Harvard and M.I.T.

¹ J. Durbin and G. S.
Watson, "Testing for Serial Correlation in Least Squares Regression," Biometrika, 37–38 (1950–1951); and D. Cochrane and G. H. Orcutt, "Application of Least Squares Regression to Relationships Containing Autocorrelated Error Terms," Journal of the American Statistical Association, 44 (1949).

² F. David and J. Neyman, "Extension of the Markoff Theorem on Least Squares," Statistical Research Memoirs, II (London, 1938).

³ J. Johnston, Econometric Methods (New York: McGraw-Hill, 1963), 207; J. Meyer and R. Glauber, Investment Decisions, Economic Forecasting, and Public Policy (Division of Research, Graduate School of Business Administration, Harvard University, 1964), 181 ff.
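The diagnostic need described in the text, detecting the existence, measuring the extent, and pinpointing the location of multicollinearity, is met in modern practice by statistics such as the variance inflation factor (VIF). The sketch below is ours, not the measure this paper goes on to propose: it regresses each explanatory variable on the others and reports how much that near-dependence inflates the variance of the corresponding coefficient estimate.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the n-by-k matrix X.
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (with an intercept). A VIF near 1 indicates
    the column is nearly orthogonal to the rest; large values locate the
    variables involved in a collinear relation. Perfectly collinear
    columns produce a division by (near) zero, i.e. an unbounded VIF."""
    X = np.asarray(X, dtype=float)
    factors = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.delete(X, j, axis=1)
        Z = np.column_stack([np.ones(len(y)), Z])  # add intercept
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        factors.append(1.0 / (1.0 - r2))
    return factors
```

Used as a diagnostic, a column whose VIF stands well above the others points to that variable as a locus of the collinearity, which is precisely the "location" information that guides the search for additional data.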