Background:
 A new optimization algorithm is presented. The method combines a new nonmonotone line search with an accelerated three-term conjugate gradient method and a damped quasi-Newton update. Compared with previous methods, the gains in efficiency, by more than one factor on different optimization problems, are notable, owing to the ability of the technique to utilize existing data.
 Materials and Methods:
 New nonmonotone line search, new modification of the damped quasi-Newton method, motivation and the new quasi-Newton algorithm (MQ), and global convergence.
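 The components listed above can be sketched generically. The following Python sketch (the paper's own experiments use Matlab) combines a standard three-term conjugate-gradient direction with a simple nonmonotone Armijo backtracking line search; the coefficient formulas for `beta` and `theta`, the history length, and all function names are illustrative assumptions, not the authors' exact (MQ) update.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, c=1e-4, shrink=0.5, max_back=50):
    """Backtracking line search with a nonmonotone Armijo condition:
    the trial value is compared against the MAX of recent f-values,
    not the current one, so occasional increases are tolerated."""
    f_ref = max(f_hist)
    slope = c * g.dot(d)          # expected decrease per unit step
    t = 1.0
    for _ in range(max_back):
        if f(x + t * d) <= f_ref + t * slope:
            return t
        t *= shrink
    return t                       # fall back to the smallest trial step

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500, memory=5):
    """Generic three-term CG iteration: d = -g + beta*d_old - theta*y,
    restarted to steepest descent when the curvature denominator vanishes."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    f_hist = [f(x)]                # sliding window for the nonmonotone test
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        t = nonmonotone_armijo(f, x, d, g, f_hist)
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g              # gradient difference
        s = x_new - x              # step
        denom = d.dot(y)
        if abs(denom) < 1e-12:
            d_new = -g_new         # restart: steepest descent
        else:
            beta = g_new.dot(y) / denom    # Hestenes-Stiefel-style coefficient
            theta = g_new.dot(s) / denom   # third-term coefficient
            d_new = -g_new + beta * d - theta * y
        x, g, d = x_new, g_new, d_new
        f_hist.append(f(x))
        if len(f_hist) > memory:
            f_hist.pop(0)
    return x, f(x), k
```

The third term `-theta * y` is what distinguishes a three-term direction from classical two-term CG; combined with the restart safeguard it keeps the search direction well-behaved without storing any matrix.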
 Results:
 In this work, we compare our new algorithm with some classical strategies such as [7] on unconstrained nonlinear optimization problems, using test functions taken from Andrei [5, 6], Waziri and Sabiu (2015) [10], and La Cruz et al. (2004) [3]. The numerical experiments demonstrate the performance of the proposed method. We selected seven unconstrained problems with dimensions varying from 10 to 100; considering three sizes of each problem gives a total of 21 test problems. The iteration is stopped when the stopping criterion is satisfied. All codes were written in Matlab R2017a and run on a PC with an Intel Core i4 processor, 4 GB of RAM, and a 2.3 GHz CPU. Each test problem was solved from two different initial starting points.
 Conclusion:
 In this research article, an accelerated three-term efficient algorithm for numerical optimization has been presented. The method is a completely derivative-free algorithm with fewer iterations (NOI), fewer function evaluations (NOF), and less CPU time compared with existing methods. Under classical assumptions, global convergence was also proved. Numerical results using the three-term efficient algorithm show that the algorithm is promising.