Assume that zero is a stable equilibrium of an ODE $\dot x=f(x,\lambda)$ for parameter values $\lambda < \lambda_0$ and becomes unstable for $\lambda > \lambda_0$. If $\lambda(t)$ varies slowly with $t$, then, under suitable conditions, the trajectories of the nonautonomous ODE $\dot x=f(x,\lambda(t))$ stay close to zero even long after $\lambda(t)$ has crossed the value $\lambda_0$. This phenomenon is called "delayed loss of stability" and is well known for ODEs. In this paper, we describe an analogous phenomenon for delay equations of the form $\dot{x}(t)=f(t,x(t-1))$. We study an example that requires combining linearization at zero with estimates on the nonlinear behavior away from zero, and for which we obtain an explicit estimate on the time until the growth of $|x(t)|$ becomes "visible."
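To illustrate the ODE case, consider a minimal worked example (the choices $\lambda_0=0$ and $\lambda(t)=\epsilon t-1$ with $0<\epsilon\ll 1$ are made here purely for concreteness and are not taken from the paper): the scalar linear equation $\dot x=\lambda(t)\,x$ has the explicit solution
\[
x(t)=x(0)\exp\!\Bigl(\int_0^t(\epsilon s-1)\,ds\Bigr)=x(0)\exp\!\Bigl(\tfrac{\epsilon}{2}t^2-t\Bigr).
\]
Although $\lambda(t)$ crosses $\lambda_0=0$ at $t=1/\epsilon$, the exponent $t\bigl(\tfrac{\epsilon}{2}t-1\bigr)$ remains negative for $0<t<2/\epsilon$, so $|x(t)|\le|x(0)|$ up to $t=2/\epsilon$: the instability becomes visible only after a further delay of order $1/\epsilon$ past the crossing.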