Abstract

This paper compares the accuracy of unconstrained gradient search using substitute derivatives approximated by finite differences or first-order response surfaces. The finite difference approximation employed was a forward difference with variable step size. Two experimental designs were used to estimate a first-order response surface: an augmented factorial design and a simplex design. Three gradient search procedures were studied: a quasi-Newton algorithm, a conjugate gradient algorithm, and a steepest descent algorithm. The results of this study demonstrate that a finite difference approximation will, in general, outperform a first-order response surface approximation. Further, both the quasi-Newton algorithm and the conjugate gradient algorithm, each paired with a finite difference approximation, provide a viable alternative to search procedures based on function comparisons.
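
To make the forward-difference idea concrete, the sketch below shows how a gradient can be substituted by forward differences inside a steepest descent loop. This is only an illustrative Python example; the paper's specific variable-step-size rule, algorithms, and test problems are not reproduced here, and the step-size heuristic and the quadratic test function in the code are assumptions for demonstration.

```python
import numpy as np

def forward_difference_gradient(f, x, h):
    """Approximate the gradient of f at x using forward differences with step h."""
    fx = f(x)
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        step = np.zeros_like(x, dtype=float)
        step[i] = h
        grad[i] = (f(x + step) - fx) / h
    return grad

def steepest_descent(f, x0, alpha=0.1, max_iter=200, tol=1e-6):
    """Steepest descent using a forward-difference substitute gradient.

    The step size h below varies with the scale of the current iterate;
    this is an illustrative heuristic, not the rule used in the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        h = max(1e-8, 1e-3 * (np.linalg.norm(x) + 1.0))  # assumed variable step size
        g = forward_difference_gradient(f, x, h)
        if np.linalg.norm(g) < tol:
            break
        x = x - alpha * g
    return x

# Usage example on a simple quadratic bowl (hypothetical test function).
x_min = steepest_descent(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [0.0, 0.0])
print(x_min)  # approaches [1, -2]
```

The same substitute gradient could be supplied to a quasi-Newton or conjugate gradient routine in place of an analytic derivative, which is the comparison the abstract describes.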
