Consider the problem of solving systems of linear algebraic equations $\mathbf{A}\mathbf{x}=\mathbf{b}$ with a real symmetric positive definite matrix $\mathbf{A}$ using the conjugate gradient (CG) method. To stop the algorithm at the appropriate moment, it is important to monitor the quality of the approximate solution. One of the most relevant quantities for measuring this quality is the $\mathbf{A}$-norm of the error. This quantity cannot be computed easily; however, it can be estimated. In this paper we discuss and analyze the behavior of the Gauss-Radau upper bound on the $\mathbf{A}$-norm of the error, based on viewing CG as a procedure for approximating a certain Riemann-Stieltjes integral. This upper bound depends on a prescribed underestimate $\mu$ of the smallest eigenvalue of $\mathbf{A}$. We concentrate on explaining a phenomenon observed in computations: in later CG iterations, the upper bound loses its accuracy and becomes almost independent of $\mu$. We construct a model problem that is used to demonstrate and study the behavior of the upper bound as a function of $\mu$, and we develop formulas that are helpful in understanding this behavior. We show that the above-mentioned phenomenon is closely related to the convergence of the smallest Ritz value to the smallest eigenvalue of $\mathbf{A}$: it occurs when the smallest Ritz value becomes a better approximation to the smallest eigenvalue than the prescribed underestimate $\mu$. We also suggest an adaptive strategy for improving the accuracy of the upper bounds in previous iterations.
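To make the setting concrete, the following sketch runs CG while updating a Gauss-Radau upper bound on $\|\mathbf{x}-\mathbf{x}_k\|_{\mathbf{A}}^2$ from a prescribed underestimate $\mu \le \lambda_{\min}(\mathbf{A})$. The particular recurrence used here (written in the code comments) is one known quadrature-based form of the bound and is an assumption of this illustration; the names `cg_with_gauss_radau`, `mu`, and the variable `D` are illustrative, not taken from the paper.

```python
import numpy as np

def cg_with_gauss_radau(A, b, mu, tol=1e-10, maxit=50):
    """Run CG on A x = b and track a Gauss-Radau upper bound.

    mu is a prescribed underestimate of the smallest eigenvalue of A.
    The bound D_k on ||x - x_k||_A^2 is updated (one known form,
    assumed here) as
        D_0 = ||r_0||^2 / mu,
        D_k = ||r_k||^2 (D_{k-1} - g_{k-1})
              / ( mu (D_{k-1} - g_{k-1}) + ||r_k||^2 ),
    where g_k = gamma_k ||r_k||^2 is the Gauss (lower-bound) increment.
    Returns the list of iterates x_k and the list of bounds D_k.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rr = r @ r
    D = rr / mu                       # Gauss-Radau bound at k = 0
    xs, bounds = [x.copy()], [D]
    for _ in range(maxit):
        Ap = A @ p
        gamma = rr / (p @ Ap)         # CG step length
        x = x + gamma * p
        r = r - gamma * Ap
        rr_new = r @ r
        g = gamma * rr                # Gauss quadrature increment
        D = rr_new * (D - g) / (mu * (D - g) + rr_new)
        xs.append(x.copy())
        bounds.append(D)
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return xs, bounds
```

On a small diagonal test problem one can check that each $D_k$ indeed stays above the true squared $\mathbf{A}$-norm of the error; the quality of the bound degrades as $\mu$ is taken further below $\lambda_{\min}(\mathbf{A})$, which is the behavior the paper analyzes.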