This paper compares the natural-conjugate Bayes and maximum likelihood estimators of the linear regression coefficients in terms of matrix mean squared error. The principal result is an inequality, involving the unknown parameters, under which one estimator has smaller matrix mean squared error than the other. Three proposals are made for testing whether this condition holds. The weakest simply substitutes estimates of the unknown parameters into the inequality. Another exploits the relationship between the Bayes estimator and the constrained least squares estimator to derive weak classical hypothesis tests. The third uses the prior density for the parameters, together with Bayesian decision theory, to determine the posterior probability that the mean squared error inequality is satisfied.
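The comparison described above can be illustrated numerically. The following is a minimal Monte Carlo sketch, not the paper's method: it contrasts the matrix mean squared error of the maximum likelihood (least squares) estimator with that of a natural-conjugate Bayes estimator in a normal linear regression. All specific settings (the design matrix, the true coefficients, the prior mean and precision, and the error variance) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))          # assumed design matrix
beta = np.array([1.0, -0.5, 0.25])   # assumed true coefficients
sigma = 1.0                          # assumed error standard deviation

# Natural-conjugate prior: beta ~ N(b0, sigma^2 * inv(A))
b0 = np.zeros(k)                     # assumed prior mean
A = 2.0 * np.eye(k)                  # assumed prior precision matrix

XtX = X.T @ X
reps = 5000
err_ml = np.zeros((k, k))
err_bayes = np.zeros((k, k))
for _ in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)
    # Maximum likelihood (ordinary least squares) estimate
    b_ml = np.linalg.solve(XtX, X.T @ y)
    # Posterior mean under the conjugate prior: a precision-weighted
    # combination of the prior mean and the data
    b_bayes = np.linalg.solve(XtX + A, X.T @ y + A @ b0)
    d_ml = b_ml - beta
    d_b = b_bayes - beta
    err_ml += np.outer(d_ml, d_ml)
    err_bayes += np.outer(d_b, d_b)

mse_ml = err_ml / reps       # Monte Carlo matrix MSE of the ML estimator
mse_bayes = err_bayes / reps # Monte Carlo matrix MSE of the Bayes estimator

# The Bayes estimator dominates in the matrix sense when the difference
# MSE(ML) - MSE(Bayes) is positive semidefinite, i.e. all of its
# eigenvalues are nonnegative.
diff = mse_ml - mse_bayes
print("smallest eigenvalue of MSE difference:", np.linalg.eigvalsh(diff).min())
```

Whether the eigenvalues of the difference are all nonnegative depends on how far the true coefficients lie from the prior mean relative to the prior precision, which is exactly the role played by the unknown-parameter inequality discussed above.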