Abstract

Adaptive query algorithms for finding the minimum of a function f are studied. The algorithms build on the earlier adaptive algorithms given in Pe [7]. The rate of convergence of these algorithms is estimated under various model assumptions on the function f. The first class of algorithms is analyzed when f satisfies a smoothness condition, e.g. f ∈ C^r, together with an assumption on its level sets as given in Pe [7]. A distinction is drawn as to whether or not the algorithm has knowledge of the semi-norm |f|_{C^r}. If this information is known, it is rather straightforward to design algorithms with optimal performance and to show that this performance is better than that of non-adaptive algorithms. More subtle is the construction of algorithms that are universal, in that they do not need to know the semi-norm of f. Universal algorithms are built that have the same asymptotic performance as when the semi-norm is known, save for a logarithmic factor. The second part of this paper studies adaptive algorithms for finding the minimum of a function in high dimension. In this case, additional assumptions are placed on f, of the form given in DPW [3], that have the effect of variable reduction and thereby avoid the curse of dimensionality. These algorithms are again shown to be asymptotically optimal up to a logarithmic factor.
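
To make the "known semi-norm" case concrete, the following is a minimal Python sketch of a generic branch-and-bound query algorithm for the case r = 1, where the semi-norm |f|_{C^1} serves as a Lipschitz bound. It illustrates the general idea of adaptively concentrating queries where the minimum can still lie; it is not the algorithm of Pe [7], and the names adaptive_min, lip, and budget are hypothetical.

```python
import heapq

def adaptive_min(f, lip, a=0.0, b=1.0, budget=64):
    """Adaptive branch-and-bound minimization of f on [a, b].

    lip is an assumed upper bound on the Lipschitz semi-norm of f
    (the "known semi-norm" case).  If fm = f(c) at the midpoint c of
    an interval [u, v], then f >= fm - lip*(v - u)/2 on that interval,
    so intervals whose certified lower bound exceeds the best value
    seen so far can be discarded without further queries.
    """
    mid = 0.5 * (a + b)
    best = f(mid)
    queries = 1
    # min-heap keyed by the certified lower bound of each interval
    heap = [(best - lip * 0.5 * (b - a), a, b)]
    while heap and queries < budget:
        lb, u, v = heapq.heappop(heap)
        if lb >= best:
            break  # no remaining interval can contain a smaller value
        m = 0.5 * (u + v)
        for lo_pt, hi_pt in ((u, m), (m, v)):
            c = 0.5 * (lo_pt + hi_pt)
            fc = f(c)
            queries += 1
            best = min(best, fc)
            heapq.heappush(
                heap, (fc - lip * 0.5 * (hi_pt - lo_pt), lo_pt, hi_pt)
            )
    return best
```

For example, adaptive_min(lambda x: abs(x - 0.3), lip=1.0) concentrates its queries near x = 0.3 and returns a value close to 0, whereas a non-adaptive algorithm would spend the same query budget uniformly over [0, 1].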
