SUMMARY

The regression model with autoregressive-moving average disturbances may be cast in a form suitable for the application of Kalman filtering techniques. This enables the generalized least squares estimator to be calculated without evaluating and inverting the covariance matrix of the disturbances. The problem of forecasting future values of the dependent variable is also effectively solved when the Kalman filter technique is applied. Furthermore, the properties of the residuals produced by the filter suggest that they may be useful for diagnostic checking of the model. The Kalman filter algorithm also forms the basis of a method for the exact maximum likelihood estimation of the model. This may well have computational, as well as theoretical, advantages over other methods.

1. INTRODUCTION

In a general formulation of the linear regression model, the disturbances may be assumed to be generated by an autoregressive-moving average process. Most work, however, has been restricted to the estimation of models with purely autoregressive disturbances. Models with autoregressive-moving average, or even pure moving average, errors have been much less popular, mainly because of computational difficulties. At first sight, it is necessary to evaluate the covariance matrix of the disturbances and then to invert it. Since this matrix is n × n, where n is the sample size, the computation can be time-consuming, although, as Akaike (1973) and Galbraith & Galbraith (1974) have shown, the special form of the covariance matrix means that it can be inverted relatively efficiently.

Maximum likelihood estimation of the regression model with autoregressive-moving average errors has been considered by Pierce (1971). The method of estimation he adopts is basically an extension of the familiar 'conditional sum of squares' approach employed by Box & Jenkins (1970) in their treatment of autoregressive-moving average time series models. This method avoids any large-scale matrix inversions, but it leads to an estimation problem which is nonlinear in the vector of regression coefficients as well as in the parameters of the autoregressive-moving average process. Furthermore, the estimators obtained in this way are only approximations to the full maximum likelihood estimators. A number of authors, for example Newbold (1974) and Osborn (1976), have recently stressed the desirability of computing estimators of autoregressive-moving average parameters using the exact likelihood function, and this view is reiterated in the context of regression models.
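To make the computational point concrete, the following is a minimal sketch of the direct generalized least squares route that the filtering approach is designed to avoid. It assumes ARMA(1,1) disturbances purely for illustration; the function names (`arma11_autocov`, `gls_direct`) and the use of NumPy/SciPy are ours, not the paper's.

```python
import numpy as np
from scipy.linalg import toeplitz

def arma11_autocov(phi, theta, sigma2, n):
    # Autocovariances gamma_0, ..., gamma_{n-1} of the stationary ARMA(1,1)
    # process u_t = phi*u_{t-1} + eps_t + theta*eps_{t-1}, eps_t ~ N(0, sigma2).
    g = np.empty(n)
    g[0] = sigma2 * (1.0 + 2.0 * phi * theta + theta**2) / (1.0 - phi**2)
    g[1] = sigma2 * (1.0 + phi * theta) * (phi + theta) / (1.0 - phi**2)
    for k in range(2, n):
        g[k] = phi * g[k - 1]
    return g

def gls_direct(y, X, phi, theta, sigma2=1.0):
    # Forms the full n x n disturbance covariance matrix Omega and solves the
    # GLS normal equations directly: O(n^2) storage and O(n^3) time in general,
    # which is the expense the text describes.
    Omega = toeplitz(arma11_autocov(phi, theta, sigma2, len(y)))
    Oi_X = np.linalg.solve(Omega, X)
    Oi_y = np.linalg.solve(Omega, y)
    return np.linalg.solve(X.T @ Oi_X, X.T @ Oi_y)
```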
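By contrast, a sketch of the filtering route described in the summary, again under the illustrative ARMA(1,1) assumption. The state-space form below is one standard representation of ARMA(1,1) disturbances (state dimension max(p, q+1) = 2), not necessarily the exact form used in the paper. The Kalman filter converts a series with this disturbance structure into standardized one-step prediction errors; because the transformation depends only on the ARMA parameters and not on the data, applying it to y and to each column of X and then running ordinary least squares reproduces the GLS estimator without ever forming or inverting the n × n covariance matrix. The name `kalman_whiten` is illustrative.

```python
import numpy as np

def kalman_whiten(series, phi, theta, sigma2=1.0):
    # One standard state-space form for ARMA(1,1) disturbances:
    #   u_t = [1, 0] alpha_t,   alpha_{t+1} = T alpha_t + R eps_{t+1},
    # with alpha_t = (u_t, theta * eps_t)'.
    T = np.array([[phi, 1.0],
                  [0.0, 0.0]])
    R = np.array([1.0, theta])
    Z = np.array([1.0, 0.0])
    Q = sigma2 * np.outer(R, R)

    # Exact initial state covariance: the stationary solution of
    # P = T P T' + Q, obtained by vectorization.
    P = np.linalg.solve(np.eye(4) - np.kron(T, T), Q.ravel()).reshape(2, 2)
    a = np.zeros(2)

    n = len(series)
    e = np.empty(n)  # standardized one-step prediction errors
    f = np.empty(n)  # prediction-error variances
    for t in range(n):
        v = series[t] - Z @ a        # innovation
        ft = Z @ P @ Z               # innovation variance
        K = (T @ P @ Z) / ft         # Kalman gain
        a = T @ a + K * v            # one-step-ahead state prediction
        P = T @ P @ T.T + Q - np.outer(K, K) * ft
        e[t] = v / np.sqrt(ft)
        f[t] = ft
    return e, f
```

Used for GLS, the filter whitens y and each regressor column with the same recursions, after which ordinary least squares on the whitened data gives the GLS estimate:

```python
e_y, _ = kalman_whiten(y, phi, theta)
E_X = np.column_stack([kalman_whiten(X[:, j], phi, theta)[0]
                       for j in range(X.shape[1])])
beta_gls, *_ = np.linalg.lstsq(E_X, e_y, rcond=None)
```

The standardized prediction errors `e_y - E_X @ beta_gls` are also the residuals whose diagnostic use the summary alludes to.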
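The same recursions yield the exact likelihood through the prediction-error decomposition, which is the sense in which the filter forms the basis of an exact maximum likelihood method. A sketch, reusing the hypothetical `kalman_whiten` above and the same ARMA(1,1) assumption:

```python
def neg2_loglik(y, X, beta, phi, theta, sigma2=1.0):
    # Exact -2 log-likelihood of the regression model with ARMA(1,1)
    # disturbances via the prediction-error decomposition:
    #   -2 log L = n log(2 pi) + sum(log f_t) + sum(v_t^2 / f_t),
    # where e_t = v_t / sqrt(f_t), so the last term is sum(e_t^2).
    e, f = kalman_whiten(y - X @ beta, phi, theta, sigma2)
    return len(y) * np.log(2.0 * np.pi) + np.sum(np.log(f)) + np.sum(e**2)
```

Minimizing this over (beta, phi, theta, sigma2) gives exact maximum likelihood estimates, as opposed to the conditional sum of squares approximation, and no n × n matrix is ever formed.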