Let Y be a random vector and Z be a random variable with joint density f(y, z | θ), where θ ∈ Θ is a vector of unknown parameters. This paper discusses minimum mean squared error (MSE) unbiased prediction of Z based on Y, and its relationship to minimum variance unbiased estimation of ψ(θ) = E[Z | θ], the expected value of Z. A Cramér–Rao type lower bound for the MSE of an unbiased predictor is presented, and a characterization of uniformly minimum MSE unbiased predictors (UMMSEUP) is discussed. When Y and Z are independent given θ, the UMMSEUP of Z and the uniformly minimum variance unbiased estimator (UMVUE) of ψ(θ) are shown to be identical. If the marginal model {f(y | θ), θ ∈ Θ} admits a complete sufficient statistic T(Y), we prove that (a) the UMMSEUP of Z exists if and only if Z admits an unbiased predictor and there exist two functions k and h such that E[Z | y, θ] = k(y) + h(T(y), θ) with probability 1 for all θ ∈ Θ, and (b) the UMMSEUP of Z and the UMVUE of ψ(θ) are the same if and only if E[Z | y, θ] depends on y only through T(y) with probability 1. We also discuss optimum predictions when the bias and MSE are defined conditionally on y and z, respectively. The results are applied to the u–v method of estimation, prediction in mixed linear models, and estimation of the mean of a finite population under a superpopulation model.
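The conditional-independence result above can be illustrated with a small Monte Carlo sketch. The normal model below is an assumed example, not one from the paper: Y₁, …, Yₙ are i.i.d. N(θ, 1) and Z ~ N(θ, 1) is independent of Y given θ, so ψ(θ) = θ and the UMVUE of θ is the sample mean Ȳ, which the result says is also the UMMSEUP of Z. The simulation checks that Ȳ is an unbiased predictor of Z and has smaller MSE than another unbiased predictor (a single observation).

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000  # illustrative values, chosen arbitrarily

# Observed data Y and an unobserved future Z, independent given theta.
Y = rng.normal(theta, 1.0, size=(reps, n))
Z = rng.normal(theta, 1.0, size=reps)

pred_umvue = Y.mean(axis=1)  # sample mean: UMVUE of psi(theta) = theta
pred_other = Y[:, 0]         # another unbiased predictor: a single observation

bias = (pred_umvue - Z).mean()                # should be near 0 (unbiasedness)
mse_umvue = ((pred_umvue - Z) ** 2).mean()    # theoretical value: 1/n + 1 = 1.1
mse_other = ((pred_other - Z) ** 2).mean()    # theoretical value: 1 + 1 = 2.0
```

Here the MSE of any unbiased predictor of Z is its variance plus Var(Z | θ) = 1, so minimizing prediction MSE reduces to minimizing estimator variance, which is why the UMVUE and the UMMSEUP coincide under conditional independence.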