For a general stationary ARMA(p, q) process u we derive the exact form of the orthogonalizing matrix R such that R'R = Σ⁻¹, where Σ = E(uu') is the covariance matrix of u, generalizing the known formulae for AR(p) processes. In a linear regression model with an ARMA(p, q) error process, transforming the data by R yields a regression model with white-noise errors. We also consider an application to semi-recursive estimation (recursive in the model parameters, but not in the parameters of the error process).

There have been many contributions to the literature on estimating the linear regression model with an autocorrelated error process; surveys are provided by, for example, Judge et al. [15]. The main technical difficulties arise in inverting the error covariance matrix, finding the orthogonalizing transformation for the errors, or evaluating the relevant quadratic forms in the inverse matrix by other means. Among results applicable to a general ARMA(p, q) process, we may distinguish several approaches or classes of result. The numerical approach, exemplified by Harvey and Phillips [12], seeks maximum likelihood estimates for the full model through numerical optimization of the likelihood; in [12], this is achieved by expressing the quadratic forms in the likelihood function through recursive residuals, which can be computed by the Kalman filter. One difficulty with this approach lies in the need to initialize the recursive filtering algorithm. The other main approach uses the inverse of the error covariance matrix to obtain generalized least-squares estimates, after the parameters of the ARMA error process have been estimated. Within this
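The role of the orthogonalizing matrix R can be illustrated numerically in the simplest special case, an AR(1) error process, where Σ is known in closed form. The sketch below takes R from a Cholesky factorization of Σ⁻¹ (any factor with R'R = Σ⁻¹ would do; the paper's contribution is an exact closed form for general ARMA(p, q)), and checks that transforming the data by R whitens the errors and that ordinary least squares on the transformed data reproduces generalized least squares. The parameter values phi, sigma2 and n are arbitrary choices for the illustration, not quantities from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative special case: an AR(1) error process u_t = phi*u_{t-1} + e_t,
# i.e. ARMA(1, 0). Its covariance matrix Sigma = E(uu') has (i, j) entry
# sigma2 / (1 - phi**2) * phi**|i - j|. The values below are arbitrary.
phi, sigma2, n = 0.6, 1.0, 50
idx = np.arange(n)
Sigma = sigma2 / (1 - phi**2) * phi ** np.abs(idx[:, None] - idx[None, :])

# Any R with R'R = Sigma^{-1} orthogonalizes u, since then
# Cov(Ru) = R Sigma R' = I. Here we take R from a Cholesky factorization.
L = np.linalg.cholesky(np.linalg.inv(Sigma))  # lower-triangular, L L' = Sigma^{-1}
R = L.T                                       # so R'R = Sigma^{-1}

# In a regression y = X b + u with ARMA errors, transforming the data by R
# gives white-noise errors, and OLS on (RX, Ry) coincides with GLS on (X, y).
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([1.0, 2.0]) + rng.multivariate_normal(np.zeros(n), Sigma)
Sigma_inv = np.linalg.inv(Sigma)
gls = np.linalg.solve(X.T @ Sigma_inv @ X, X.T @ Sigma_inv @ y)
ols_transformed, *_ = np.linalg.lstsq(R @ X, R @ y, rcond=None)

assert np.allclose(R.T @ R, Sigma_inv)        # defining property of R
assert np.allclose(R @ Sigma @ R.T, np.eye(n))  # transformed errors are white
assert np.allclose(gls, ols_transformed)      # GLS = OLS after transforming by R
```

The equivalence in the last assertion is the algebraic point behind both approaches surveyed above: whether one works with Σ⁻¹ directly (GLS) or with an orthogonalizing factor R (transformed OLS), the resulting estimator is the same.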