Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they combine the practical advantages of simple computation and low memory requirements with attractive theoretical features, namely the use of curvature information and strong global convergence properties. Based on an analysis of minimizing the condition number and ensuring the positive definiteness of the corresponding matrix, we propose a choice of the parameter in the Dai-Liao method and design a descent conjugate gradient algorithm that possesses the sufficient descent property independently of the line search technique. Under some common conditions, global convergence is established for uniformly convex functions and for general nonlinear functions. In the numerical experiments, we first consider 46 ill-conditioned matrix problems and present the corresponding results. We then test 450 large-scale unconstrained problems. Finally, we give an accelerated strategy for the proposed algorithm and apply it to some image restoration problems. The numerical results indicate that the algorithm is reliable and considerably more efficient and effective than the other methods on the test problems.
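For context, the Dai-Liao family underlying the proposed algorithm updates the search direction as d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The following is a minimal Python sketch of this scheme under a fixed illustrative parameter t and a simple Armijo backtracking line search; the paper's condition-number-based choice of t is not reproduced here.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Nonlinear CG with a Dai-Liao direction update.

    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    The scalar t = 0.1 is an illustrative fixed choice, not the
    adaptive parameter proposed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        beta = (g_new @ y - t * (g_new @ s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        # Restart with steepest descent if d is not a descent direction.
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dai_liao_cg(f, grad, np.zeros(2))
print(np.allclose(A @ x_star, b, atol=1e-4))  # True
```

For a strictly convex quadratic the iteration converges rapidly; on general nonlinear problems the descent-restart safeguard keeps each step a descent direction regardless of the line search, mirroring the sufficient descent property highlighted above.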