A functional form often employed in cost-estimating relationships is Y = αX^β, where Y and X are the cost and predictor variables, respectively, and α and β are parameters to be estimated from empirical data. Logarithmic transformation of both sides of this intrinsically linear relationship leads to Y* = α* + βX*, with the asterisks denoting logarithms. What typically happens in practice is that ordinary least squares (OLS) regression is applied to a set of observations on Y* and X*, generating estimates of α* and β. From a statistical point of view, however, justification of this process requires the presence of a multiplicative random error term in the untransformed model. In addition, the multiplicative structure gives rise to variance in cost that changes systematically with X. If the error term is additive, the variance in cost is constant, but the parameters in that model must be estimated by nonlinear least squares (NLS) applied to the untransformed variables. In general, OLS and NLS lead to different parameter estimates and different cost predictions, and analysts are frequently unsure which method is correct. This paper reports on a Monte Carlo study of the accuracy of each estimation method under correct and incorrect error specifications. The multi-predictor model that serves as the basis of the study is representative of models with which cost analysts frequently deal. The results suggest that OLS, applied to the logs of the variables, may be the preferred method under either specification, provided the estimate of α is properly developed. This finding, however, applies strictly to the issue of parameter estimation; the accuracy of cost prediction requires further analysis.
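
To make the contrast between the two estimation routes concrete, the following is a minimal sketch in Python (NumPy/SciPy) of the kind of comparison the abstract describes: data are simulated from a two-predictor power-function model under a multiplicative (log-normal) error and under an additive error, and the parameters are recovered both by OLS on the logged variables and by NLS on the untransformed variables. The parameter values, predictor ranges, and error scales are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Hypothetical "true" parameters and predictor ranges (illustrative only)
alpha_true, b1_true, b2_true = 5.0, 0.8, 0.4
n = 200
x1 = rng.uniform(10, 100, n)
x2 = rng.uniform(1, 20, n)
mean_y = alpha_true * x1**b1_true * x2**b2_true

def fit_ols_logs(y, x1, x2):
    """OLS on the log-transformed model: log y = a* + b1*log x1 + b2*log x2."""
    X = np.column_stack([np.ones_like(x1), np.log(x1), np.log(x2)])
    coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
    a_star, b1, b2 = coef
    # Naive back-transform exp(a*); under log-normal error this is biased,
    # which is the "properly developed" estimate-of-alpha issue the abstract notes.
    return np.exp(a_star), b1, b2

def fit_nls(y, x1, x2):
    """NLS on the untransformed model: y = a * x1**b1 * x2**b2 + additive error."""
    model = lambda X, a, b1, b2: a * X[0]**b1 * X[1]**b2
    popt, _ = curve_fit(model, (x1, x2), y, p0=[1.0, 1.0, 1.0])
    return tuple(popt)

# Case 1: multiplicative (log-normal) error -- OLS on logs matches this specification
y_mult = mean_y * rng.lognormal(mean=0.0, sigma=0.3, size=n)
# Case 2: additive, constant-variance error -- NLS on raw variables matches this
# specification (scale kept small so all simulated costs remain positive)
y_add = mean_y + rng.normal(scale=5.0, size=n)

for label, y in [("multiplicative error", y_mult), ("additive error", y_add)]:
    print(label)
    print("  OLS on logs:", fit_ols_logs(y, x1, x2))
    print("  NLS        :", fit_nls(y, x1, x2))
```

Repeating this simulation over many replications and tabulating the estimation error of each method under each error specification is, in outline, the structure of the Monte Carlo comparison the paper reports.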