In this paper the problem of line search, an important step in most multidimensional optimization algorithms, is considered. If conjugate gradient descent is used to produce a descent sequence for a given functional f on a Hilbert space H, then at every step one must find the step length α that minimizes f(x^(k) + αP^(k)), where x^(k), P^(k) ∈ H are the k-th element of the descent sequence and the k-th descent direction, respectively. This is usually accomplished by a one-dimensional search. The purpose of this paper is to discuss a search method that determines, by a combination of analytic and synthetic procedures, a sequence {α^(k)} such that the sequence {f(x^(k) + α^(k)P^(k))} converges to f(x*), the minimum of f. Specifically, the step in the Fletcher-Reeves algorithm that uses the geometric mean of the past values of α^(k) as the initial estimate is replaced by their harmonic mean, yielding initially a low-order-accuracy formula. The efficiency of this technique is confirmed by numerical experimentation.
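The sketch below illustrates, under stated assumptions, the kind of modification the abstract describes: a Fletcher-Reeves conjugate-gradient loop in which the one-dimensional search is started from the harmonic mean of the previous step lengths. It is not the authors' implementation; the function names, the default first-iteration estimate of 1.0, and the use of SciPy's Brent minimizer in place of the paper's own search procedure are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code): harmonic-mean initialization of the
# line search inside a Fletcher-Reeves conjugate-gradient iteration.
import numpy as np
from scipy.optimize import minimize_scalar


def initial_step(past_alphas):
    """Harmonic mean of previous step lengths, used as the initial estimate."""
    if not past_alphas:
        return 1.0  # assumed default for the very first iteration
    return len(past_alphas) / sum(1.0 / a for a in past_alphas)


def line_search(f, x, p, alpha0):
    """One-dimensional minimization of phi(a) = f(x + a*p), started near alpha0."""
    phi = lambda a: f(x + a * p)
    res = minimize_scalar(phi, bracket=(0.0, alpha0))  # Brent's method (assumed stand-in)
    return res.x


def conjugate_gradient(f, grad, x0, iters=50):
    """Fletcher-Reeves CG with harmonic-mean step-length initialization."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    p = -g
    past_alphas = []
    for _ in range(iters):
        alpha = line_search(f, x, p, initial_step(past_alphas))
        if alpha > 0:
            past_alphas.append(alpha)  # keep only positive steps for the harmonic mean
        x = x + alpha * p
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves beta
        p = -g_new + beta * p
        g = g_new
    return x
```

Since the harmonic mean of positive numbers never exceeds their geometric mean, this initialization gives a more conservative starting estimate than the geometric-mean rule it replaces.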