If a linear regression is fit to log-transformed mortalities and the estimate is back-transformed according to the formula Ee^Y = e^(μ+σ²/2), a systematic bias occurs unless the error distribution is normal and the scale estimate is gauged to normal variance. This result is a consequence of the uniqueness theorem for the Laplace transform. We determine the systematic bias of minimum-L2 and minimum-L1 estimation, with the sample variance and the interquartile range of the residuals as scale estimates, under a uniform and four contaminated normal error distributions. Even under innocent-looking contaminations, the true mortalities may be underestimated by 50% in the long run. Moreover, the logarithmic transformation introduces an instability into the model that results in a large discrepancy between Huber regression estimates as the tuning constant regulating the degree of robustness varies. In contrast to the logarithm, the square root stabilizes variance, diminishes the influence of outliers, automatically copes with observed zeros, allows the “nonparametric” back-transformation formula EY² = μ²+σ² and, in the homoskedastic case, avoids a systematic bias of minimum-L2 estimation with the sample variance. For the company-specific table 3 of [Loeb94], in the age range of 20–65 years, we fit a parabola to root mortalities by minimum-L2, minimum-L1, and robust Huber regression estimates, and a cubic and an exponential by least squares. The fits thus obtained in the original model are excellent and practically indistinguishable by a χ² goodness-of-fit test. Finally, dispensing with the transformation of observations, we employ a Poisson generalized linear model and fit an exponential and a cubic by maximum likelihood.
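The back-transformation bias described above can be illustrated with a small simulation. This is a hypothetical sketch, not the paper's data or exact setup: errors are drawn from an assumed contaminated normal mixture (90% N(0, 0.5²), 10% N(0, 2²)), the log-scale mean and variance are estimated by their sample analogues, and the plug-in formula Ee^Y = e^(μ+σ²/2) is compared against the actual mean of e^Y under that error law.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical contaminated normal errors (illustration only, not the
# paper's distributions): 90% N(0, 0.5^2) contaminated by 10% N(0, 2^2).
n = 200_000
clean = rng.random(n) < 0.9
eps = np.where(clean, rng.normal(0.0, 0.5, n), rng.normal(0.0, 2.0, n))

mu_true = 1.0        # assumed true mean on the log scale
y = mu_true + eps    # simulated log-transformed observations

# Plug-in back-transformation: sample mean and sample variance of y
# inserted into the lognormal formula E e^Y = e^(mu + sigma^2/2).
mu_hat = y.mean()
s2_hat = y.var(ddof=1)
estimate = np.exp(mu_hat + s2_hat / 2)

# The actual target E e^Y under this (non-normal) error distribution.
target = np.exp(y).mean()

# Because the errors are not normal, the plug-in estimate systematically
# undershoots the target, as the abstract's Laplace-transform argument
# predicts.
print(f"plug-in: {estimate:.3f}  target: {target:.3f}")
```

Under this particular mixture the plug-in estimate falls roughly 20% short of the target; heavier contamination widens the gap, which is the mechanism behind the long-run underestimation discussed in the abstract.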