Abstract

The steepest descent (SD) method is well known as the simplest method in optimization. In this paper, we propose a new SD search direction for solving unconstrained optimization problems. We also prove that the method is globally convergent under exact line search for general objective functions. The proposed method is motivated by previous work on the SD method by Zubai’ah-Mustafa-Rivaie-Ismail (ZMRI). A comparison was performed with the earlier SD method and with the conjugate gradient (CG) method using the Fletcher-Reeves (FR) and Rivaie-Mustafa-Ismail-Leong (RMIL) conjugate coefficients. Based on the numerical results, the new search direction substantially outperforms the previous SD method in terms of number of iterations and central processing unit (CPU) time on the given standard test problems.
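For orientation, the classical SD iteration the paper builds on can be sketched as follows. This is a minimal illustration of steepest descent with exact line search on a convex quadratic, where the exact step length has a closed form; it is not the paper's modified search direction, and the quadratic objective, tolerance, and iteration cap are assumptions for the example.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-8, max_iter=1000):
    """Classical SD on the quadratic f(x) = 0.5 x^T A x - b^T x (A SPD).

    Search direction d_k = -g_k (negative gradient); for this quadratic
    the exact line-search step is alpha_k = (g_k^T g_k) / (g_k^T A g_k).
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = A @ x - b                    # gradient of the quadratic
        if np.linalg.norm(g) < tol:      # stop when the gradient is small
            return x, k
        alpha = (g @ g) / (g @ (A @ g))  # exact line-search step length
        x = x - alpha * g                # move along the negative gradient
    return x, max_iter

# Example: minimizer of f solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star, iters = steepest_descent(A, b, np.zeros(2))
```

Methods such as the one proposed in the paper modify the search direction itself; the exact line search and the stopping test stay the same.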
