In the empirical situation we assume that a random parameter Θ occurs according to an unknown distribution G(θ). We then observe the value of a random variable X according to a conditional distribution F(x | θ), where θ is the unknown value of Θ. We require an estimator for θ which has small expected squared error. We assume further that this is a routinely recurring situation and that this is its nth occurrence. We can therefore base our estimate on the values of all n observations, x1, x2, ..., xn. It should be emphasized here that we require an estimate of θn, which is the parameter in the distribution of only the last observation.

The empirical Bayes approach to the problem was introduced by Robbins (1955) and neatly summarized by Rutherford & Krutchkoff (1969). One obtains the Bayes estimator for the situation, the mean of the posterior distribution, in a form which does not contain the prior distribution explicitly. Generally, this form contains marginal density or mass functions which can be estimated using x1, x2, ..., xn. As was shown by Clemmer & Krutchkoff (1968), available estimators for density functions afford the empirical Bayes procedures excellent small sample properties. On the other hand, Maritz (1966) showed that for the discrete case the usual estimator for mass functions gives the empirical Bayes procedures poor small sample properties; he suggests smoothing the estimates of the discrete probabilities. Here we present a simple general procedure for smoothing estimates of discrete probabilities and demonstrate its effect on the small sample properties of the empirical Bayes procedure for the Poisson situation.
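To make the discussion concrete, the following sketch shows Robbins' (1955) estimator for the Poisson case, where the posterior mean takes the form E[Θ | X = x] = (x + 1) p(x + 1) / p(x) and the unknown marginal mass function p is replaced by empirical frequencies from the n past observations. This unsmoothed form is the one whose small-sample behaviour Maritz (1966) criticized; the add-one smoothing shown alongside it is a generic illustration of why smoothing helps, not the procedure proposed in this paper.

```python
from collections import Counter


def robbins_poisson(history, x, smooth=0.0):
    """Empirical Bayes estimate of the Poisson mean for the current
    observation x, given past observations `history`.

    The Bayes estimator E[theta | X = x] = (x + 1) p(x + 1) / p(x)
    is evaluated with empirical frequencies in place of the unknown
    marginal mass function p.  With smooth = 0 this is Robbins'
    unsmoothed estimator; smooth > 0 applies generic add-constant
    smoothing (an illustrative stand-in, not this paper's method).
    """
    n = len(history)
    counts = Counter(history)  # missing values of x count as zero
    support = max(counts) + 2  # crude finite support for smoothing
    p_x = (counts[x] + smooth) / (n + smooth * support)
    p_x1 = (counts[x + 1] + smooth) / (n + smooth * support)
    if p_x == 0.0:
        # x never observed before and no smoothing: fall back on x itself.
        return float(x)
    return (x + 1) * p_x1 / p_x


# Example: past observations and a current value x = 1.
past = [0, 1, 1, 2]
print(robbins_poisson(past, 1))             # (1+1) * (1/4) / (2/4) = 1.0
print(robbins_poisson(past, 1, smooth=1.0))
```

Note how the unsmoothed estimate is driven entirely by the ratio of two observed frequencies; when n is small this ratio is erratic (and undefined when p(x) is estimated as zero), which is precisely the difficulty that motivates smoothing the discrete probability estimates.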