Gaussian processes are commonly used for modeling the output of deterministic computer models. We consider the behavior of maximum likelihood estimators (MLEs) of parameters of the commonly used squared exponential covariance function when the computer model has some simple deterministic form. We prove that, for regularly spaced observations on the line, the MLE of the scale parameter converges to zero if the computer model is a constant function and diverges to infinity for linear functions. When observing successive derivatives of a $p$th order monomial at zero, we find the asymptotic order of the MLE of the scale parameter for all $p\ge 0$. For some commonly used test functions, we compare the MLE with cross validation in a prediction problem and explore the joint estimation of the range and scale parameters. The correlation matrix is nearly numerically singular even when the sample size is moderate. To overcome these numerical difficulties, we perform exact computation by exploiting exact results for the correlation matrix and restricting ourselves to parameter values and test functions that yield rational correlations and function values at the observation locations. We also consider the common approach of including a nugget effect to deal with the numerical difficulties, and explore its consequences for model fitting and prediction.
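
The near-singularity mentioned above can be seen directly. The following is a minimal illustrative sketch (not from the paper, and using standard floating-point rather than the paper's exact rational computation): it builds the squared exponential correlation matrix for regularly spaced points on $[0,1]$, reports its condition number, and shows how a small nugget restores numerical stability. The grid size, range parameter, and nugget value are arbitrary choices for illustration.

```python
import numpy as np

def sq_exp_corr(x, range_param):
    # Squared exponential correlation: R_ij = exp(-((x_i - x_j) / range)^2)
    d = x[:, None] - x[None, :]
    return np.exp(-(d / range_param) ** 2)

n = 30                          # a moderate sample size
x = np.linspace(0.0, 1.0, n)    # regularly spaced observations on [0, 1]
R = sq_exp_corr(x, range_param=1.0)

# The condition number is astronomically large: R is numerically singular
# in double precision, so likelihood computations are unreliable.
print(f"condition number of R (n={n}): {np.linalg.cond(R):.3e}")

# Adding a small nugget (a diagonal perturbation) makes R well conditioned,
# at the cost of changing the model being fit.
nugget = 1e-6
print(f"with nugget {nugget}: {np.linalg.cond(R + nugget * np.eye(n)):.3e}")
```

This floating-point experiment motivates the paper's two remedies: exact rational computation, which avoids round-off entirely, and the nugget, which trades model fidelity for numerical stability.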