For the simple problem of estimating a vector ${\bf x}^0$ from a noisy data vector ${\bf y} = B{\bf x}^0 + {\bf e}$, where $B$ is a known ill-conditioned $m \times n$ matrix and ${\bf e}$ is an unknown “white noise” vector, a classical regularized solution, say ${\bf x}(\tau)$ where $\tau > 0$ is the regularization parameter, can be satisfactory provided $\tau$ is well chosen. Standard data-based methods for choosing $\tau$ (like generalized cross validation, or GCV) are known to give a good estimate of the value of $\tau$ which minimizes the prediction error $||B{\bf x}(\tau) - B{\bf x}^0||^2$. In this paper, we focus on the minimization of the estimation (or reconstruction) error $||{\bf x}(\tau) - {\bf x}^0||^2$. We give sufficient conditions for the existence of two unbiased estimators of the expectation of the inner product $\langle {{\bf x}^0, {\bf x}(\tau)} \rangle$. This provides two estimates of the $\tau$ which minimizes $||{\bf x}(\tau) - {\bf x}^0||^2$. (The first one was proposed by Rice [Contemporary Mathematics, 59 (1986), pp. 137–151].) We compare these two estimators in the case of deconvolution problems. In theory, the second estimator no longer has the possibly “infinite” variance of the first one; however, both are likely to produce frequent dramatic undersmoothing. We then propose a third class of estimators based on automatic stabilization procedures, which are much more efficient in many deconvolution problems. This new approach to choosing regularization parameters can significantly improve on GCV, especially for “severely” ill-conditioned problems. This is easily shown by analyzing a simple example and is confirmed by numerical simulations with different degrees of ill-conditioning.
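The setting above can be illustrated with a small numerical sketch. The test problem (a Gaussian blur matrix, the chosen signal, and the noise level $\sigma$) and the Tikhonov form of the regularized solution ${\bf x}(\tau) = (B^{T}B + \tau I)^{-1}B^{T}{\bf y}$ are illustrative assumptions, not taken from the paper. Alongside GCV, it sketches a Rice-type unbiased estimator of $E\langle {\bf x}^0, {\bf x}(\tau)\rangle$ built from the identity $E[{\bf y}^{T}A{\bf y}] = (B{\bf x}^0)^{T}A(B{\bf x}^0) + \sigma^2\,\mathrm{tr}(A)$ with $A = B^{-T}(B^{T}B+\tau I)^{-1}B^{T}$, assuming $\sigma$ known and $B$ invertible; the precise estimators and the stabilized third class are in the body of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
t = np.arange(n)
# Smooth Gaussian convolution kernel -> ill-conditioned B (illustrative choice)
B = np.exp(-0.1 * (t[:, None] - t[None, :]) ** 2)
x0 = np.zeros(n)
x0[20:40] = 1.0                      # hypothetical "true" signal
sigma = 0.05                          # assumed known noise level
y = B @ x0 + sigma * rng.standard_normal(n)

# Work in the SVD basis: B = U diag(s) V^T
U, s, Vt = np.linalg.svd(B)
uty = U.T @ y

def x_tau(tau):
    # Tikhonov solution x(tau) = V diag(s/(s^2+tau)) U^T y
    return Vt.T @ (s / (s**2 + tau) * uty)

def gcv(tau):
    # GCV targets the prediction error ||B x(tau) - B x0||^2
    f = s**2 / (s**2 + tau)           # filter factors, diag of the hat matrix
    return np.sum(((1 - f) * uty) ** 2) / (n - np.sum(f)) ** 2

def est_inner(tau):
    # Rice-type unbiased estimate of E<x0, x(tau)>:
    #   y^T B^{-T} R_tau y - sigma^2 tr(B^{-T} R_tau),
    # which in the SVD basis is sum(uty^2/(s^2+tau)) - sigma^2 sum(1/(s^2+tau)).
    return np.sum(uty**2 / (s**2 + tau)) - sigma**2 * np.sum(1.0 / (s**2 + tau))

def est_risk(tau):
    # ||x(tau)||^2 - 2<x0, x(tau)> estimates ||x(tau)-x0||^2 up to ||x0||^2
    return np.sum(x_tau(tau) ** 2) - 2.0 * est_inner(tau)

taus = np.logspace(-8, 2, 200)
tau_gcv = taus[np.argmin([gcv(t_) for t_ in taus])]
tau_rice = taus[np.argmin([est_risk(t_) for t_ in taus])]

def recon_err(tau):
    return np.sum((x_tau(tau) - x0) ** 2)

tau_best = taus[np.argmin([recon_err(t_) for t_ in taus])]
print(tau_gcv, tau_rice, tau_best)
```

Because the $1/(s_i^2+\tau)$ terms blow up on the smallest singular values, `est_inner` has very high variance for severely ill-conditioned $B$, which mirrors the undersmoothing behavior discussed above.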