We consider the problem of numerically approximating the derivative of a smooth function using only function evaluations. In particular, we examine the regression gradient, the generalized simplex gradient, and the generalized centered simplex gradient: three numerical techniques that use function values at a collection of sample points to construct ‘best-fit’ linear models. Under some conditions, these gradient approximations admit error bounds that depend on the number of sample points used, the Lipschitz constant of the true gradient, and the geometry of the sample set. Perhaps counter-intuitively, as the number of sample points on a fixed domain increases to infinity, these error bounds can also increase to infinity. In this work, we first explore the behavior of the error bound for generalized simplex gradients of a single-variable function (f : R → R). Thereafter, we investigate the behavior of the absolute error for these three gradient approximation techniques as the number of sample points tends to infinity. Under reasonable assumptions, we prove that the absolute errors remain bounded as the number of sample points increases to infinity on a fixed interval.
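For concreteness, here is a minimal sketch of the generalized simplex gradient under its standard pseudoinverse definition, ∇_S f(x0) = (Sᵀ)⁺ δ_S f(x0), where δ_S f(x0) collects the differences f(x0 + sᵢ) − f(x0); the test function, sample set, and the helper name generalized_simplex_gradient below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def generalized_simplex_gradient(f, x0, S):
    # S is n-by-k; column i is the direction from x0 to sample point x0 + S[:, i].
    # The approximation is (S^T)^+ @ delta, with ()^+ the Moore-Penrose pseudoinverse
    # and delta the vector of function-value differences at the sample points.
    delta = np.array([f(x0 + S[:, i]) - f(x0) for i in range(S.shape[1])])
    return np.linalg.pinv(S.T) @ delta

# Single-variable example (n = 1), matching the setting f : R -> R.
f = lambda x: np.sin(x[0])
x0 = np.zeros(1)
S = np.array([[0.1, 0.2, 0.3]])          # k = 3 sample directions on a fixed interval
print(generalized_simplex_gradient(f, x0, S))  # approximately cos(0) = 1
```

With more than n sample points, as here, the pseudoinverse yields the least-squares ‘best-fit’ linear model described above; increasing k adds rows to Sᵀ without changing the dimension of the recovered gradient.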