Consider the unconstrained minimization problem for a continuously differentiable function of several variables. An iterative algorithm for solving such a problem is called a multi-term algorithm if, in order to find the next approximation to the optimal point, it must compute values of the function or its gradient at two or more previous points. In this sense, the conjugate gradient algorithm is a two-term algorithm. The aim of this paper is to study a generalized $p$-term method for unconstrained optimization. We establish the properties of this algorithm for quadratic functions and prove that it belongs to the class of conjugate direction methods. The goal of the computational experiment was to compare minimization results for different numbers of terms $p$ and to find an ``optimal'' value of $p$. Numerical results for several well-known test functions are given.
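The abstract fixes no notation; purely as an illustration of the definition, a $p$-term update could take a form such as the following (the symbols $x_k$, $\alpha_k$, and $\beta_{k,i}$ are assumed here, not taken from the paper):

```latex
% Illustrative (assumed) p-term update rule: the new iterate depends on the
% current point and the p-1 preceding points.
x_{k+1} = x_k - \alpha_k \nabla f(x_k)
          + \sum_{i=1}^{p-1} \beta_{k,i}\,\bigl(x_{k-i+1} - x_{k-i}\bigr)
```

For $p=2$ this uses only the current point and one previous point, matching the two-term character of the conjugate gradient method noted above.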