To improve the numerical performance of the classic Fletcher–Reeves conjugate gradient method, Jiang and Jian embedded a shrinkage multiplier in the Fletcher–Reeves parameter, which enhances its computational merits. We show that when the iterations jam, the line search becomes approximately exact, so the search direction of the Jiang–Jian method automatically approaches the steepest descent direction; this may overcome the undesirable jamming effect of the classic Fletcher–Reeves method. A more detailed analysis of the numerical behavior of the Fletcher–Reeves method under the jamming phenomenon is also presented. Then, building on the advantages of the Jiang–Jian method, a new conjugate gradient parameter is devised. Additionally, a descent spectral version of the proposed method is developed that satisfies the sufficient descent property regardless of the line search technique. Convergence of the proposed methods is established under standard assumptions. To support our theoretical arguments, the computational merits of the given algorithms are illustrated on a set of CUTEr test functions and on an image restoration problem.
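To illustrate the idea of damping the Fletcher–Reeves parameter, the sketch below implements a conjugate gradient iteration in which a fixed shrinkage multiplier `mu` scales the classic Fletcher–Reeves coefficient. This is only a minimal sketch under stated assumptions: the actual Jiang–Jian multiplier is an adaptive quantity rather than a constant, and the function name `fr_cg_shrunk`, the Armijo backtracking line search, and the steepest-descent restart safeguard are illustrative choices, not the authors' exact algorithm.

```python
import numpy as np

def fr_cg_shrunk(f, grad, x0, mu=0.9, tol=1e-6, max_iter=1000):
    """Conjugate gradient with a shrinkage multiplier mu in (0, 1]
    on the Fletcher-Reeves parameter (illustrative sketch only;
    the Jiang-Jian multiplier is adaptive, not a fixed constant)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        if g @ d >= 0:
            # Safeguard: restart with the steepest descent direction
            # whenever d fails to be a descent direction.
            d = -g
        # Simple backtracking (Armijo) line search.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        beta_fr = (g_new @ g_new) / gnorm2  # classic Fletcher-Reeves parameter
        d = -g_new + mu * beta_fr * d       # shrinkage damps the FR term
        g = g_new
    return x
```

Shrinking the coefficient pulls the search direction toward the steepest descent direction, which is the mechanism the jamming analysis above exploits.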