Abstract

The line search procedure for the step length and the choice of search direction are two key elements of a line search algorithm. The line search procedure deserves particular attention because of its far-reaching implications for the robustness and efficiency of the algorithm. The purpose of this paper is to propose a simple yet effective line search strategy for solving unconstrained convex optimization problems. The proposed procedure does not require evaluation of the objective function; instead, it enforces a reduction in the gradient norm along each search direction, which makes it suitable for problems where function evaluations are very costly. To illustrate its effectiveness, we employ the procedure together with the symmetric rank-one (SR1) quasi-Newton update and test it against the same quasi-Newton method with the well-known Armijo line search. Numerical results on a set of standard unconstrained optimization problems show that the proposed procedure is superior to the Armijo line search.
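To make the idea concrete, below is a minimal Python sketch of one way such a scheme could look: a backtracking rule that accepts a step length once the gradient norm at the trial point falls below its value at the current iterate, paired with an SR1 update. The function names, default parameters, and the SR1 safeguard threshold are illustrative assumptions and do not reproduce the authors' exact procedure.

```python
import numpy as np

def gradient_norm_line_search(grad, x, d, alpha0=1.0, beta=0.5, max_iter=30):
    """Backtracking on the step length using only gradient evaluations:
    accept alpha when the gradient norm at the trial point is smaller than
    at the current point (illustrative stand-in for the paper's rule)."""
    g_norm = np.linalg.norm(grad(x))
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(grad(x + alpha * d)) < g_norm:
            return alpha
        alpha *= beta          # shrink the step and try again
    return alpha               # fallback if no reduction was found

def sr1_minimize(grad, x0, tol=1e-6, max_iter=200):
    """Quasi-Newton iteration with the symmetric rank-one (SR1) update and
    the gradient-norm line search above (a sketch, not the paper's code)."""
    n = x0.size
    B = np.eye(n)                      # approximate Hessian
    x = x0.copy()
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -np.linalg.solve(B, g)     # quasi-Newton search direction
        alpha = gradient_norm_line_search(grad, x, d)
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        r = y - B @ s
        denom = r @ s
        # Standard SR1 safeguard: skip the update when the denominator is tiny.
        if abs(denom) > 1e-8 * np.linalg.norm(r) * np.linalg.norm(s):
            B += np.outer(r, r) / denom
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: a simple convex quadratic f(x) = 0.5 x^T A x - b^T x,
    # whose gradient is A x - b and whose minimizer solves A x = b.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    grad = lambda x: A @ x - b
    x_star = sr1_minimize(grad, np.zeros(2))
    print(x_star, np.linalg.solve(A, b))  # the two should roughly agree
```

Note that the loop never calls the objective itself, only its gradient, which is the point of a gradient-norm-based acceptance rule when function values are expensive to compute.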
