Abstract

The purpose of this paper is twofold: first, to introduce deterministic strategies for directional direct-search methods, including new instances of the mesh adaptive direct-search (MADS) and generating set search (GSS) classes of algorithms, which produce a well-distributed set of poll directions compared to other strategies; and second, to introduce variants of each algorithm that use a minimal positive basis at each step. The strategies base their poll directions either on the QR decomposition, to obtain an orthogonal set of directions, or on the equal-angle directions from a regular simplex centered at the origin with vertices on the unit sphere. Test results are presented on a set of smooth, nonsmooth, unconstrained, and constrained problems, comparing the various implementations of these directional direct-search methods.
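The QR-based strategy described above can be sketched as follows. This is an illustrative reconstruction, not the paper's exact procedure: the deterministic seed matrix chosen here is an assumption, standing in for whatever full-rank matrix the method generates.

```python
import numpy as np

def qr_poll_directions(n, seed_matrix=None):
    """Illustrative sketch: orthogonal poll directions via QR decomposition.

    A full-rank n x n matrix A is factored as A = QR; the orthonormal
    columns of Q together with their negatives form a maximal positive
    basis (2n directions) for R^n.
    """
    if seed_matrix is None:
        # Deterministic full-rank matrix (an arbitrary illustrative
        # choice, not the paper's construction): unit upper triangular.
        seed_matrix = np.eye(n) + np.triu(np.ones((n, n)), k=1)
    Q, _ = np.linalg.qr(seed_matrix)
    return np.hstack([Q, -Q])  # 2n poll directions as columns
```

Because the construction is deterministic, repeated runs poll the same directions, which is the repeatability property the paper emphasizes.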

Highlights

  • In [1], Vicente and Custodio introduced a framework for directional direct-search methods (DSM) encompassing both the mesh adaptive direct-search (Mads) [2] class of algorithms and the generating set search (Gss) [3] class of algorithms for black-box optimization problems of the form min f(x), x ∈ Ω (1), where the objective function f : Ω ⊆ Rn → R ∪ {∞} is nonsmooth, possibly discontinuous, and Ω is the set of feasible points

  • This paper introduces instances of Mads and Gss, called EadMads and EadGss, that have versions using both a maximal positive basis (2n polling directions) and a minimal positive basis (n + 1 polling directions)

  • This paper has outlined strategies for making both the Mads and Gss algorithms deterministic, ensuring that the end results are repeatable, and outlined a method for using a set of n + 1 directions generated from a regular simplex
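The n + 1 equal-angle directions from a regular simplex can be built as in the sketch below. This uses one standard construction (center the standard basis of R^{n+1}, then map the centered points into R^n); the paper's exact recipe may differ.

```python
import numpy as np

def regular_simplex_directions(n):
    """Sketch: n + 1 unit vectors in R^n with equal pairwise angles
    (cosine -1/n) -- the vertices of a regular simplex centered at
    the origin with vertices on the unit sphere."""
    # Center the standard basis of R^{n+1}; the centered points lie
    # in an n-dimensional hyperplane orthogonal to the all-ones vector.
    C = np.eye(n + 1) - 1.0 / (n + 1)
    # An orthonormal basis of that hyperplane (from the SVD row space)
    # maps the points to coordinates in R^n, preserving inner products.
    _, _, Vt = np.linalg.svd(C)
    P = C @ Vt[:n].T
    P /= np.linalg.norm(P, axis=1, keepdims=True)
    return P.T  # columns are the n + 1 directions
```

The resulting columns sum to zero and have rank n, so they form a minimal positive basis of R^n: the n + 1 poll directions used by the minimal-basis variants.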


Summary

Introduction

In [1], Vicente and Custodio introduced a framework for directional direct-search methods (DSM) encompassing both the Mads [2] class of algorithms (using an underlying mesh and simple decrease) and the Gss [3] class of algorithms (using sufficient decrease under a forcing function ρ) for black-box optimization problems of the form min f(x), x ∈ Ω (1). One outcome of this paper is to introduce a deterministic version of QrMads by deterministically generating a matrix with full rank. It has been observed that, in some cases, reducing the number of function evaluations at every iteration from a maximal positive spanning set to a minimal positive spanning set can improve the performance of these types of algorithms (see, e.g., [12,13,14]). For this reason, this paper introduces instances of Mads and Gss, called EadMads and EadGss, that have versions using both a maximal positive basis (2n polling directions) and a minimal positive basis (n + 1 polling directions).
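The distinction between simple decrease (Mads-type) and sufficient decrease under a forcing function ρ (Gss-type) can be illustrated with a minimal poll-step sketch. All names here are illustrative, not the paper's, and mesh rounding and constraint handling are omitted.

```python
import numpy as np

def poll(f, x, alpha, directions, rho=None):
    """One poll step of a directional direct-search method (sketch).

    With rho=None a trial point is accepted on simple decrease,
    f(trial) < f(x), as in Mads-type methods; with a forcing function
    rho (e.g. rho(t) = c * t**2) acceptance requires the sufficient
    decrease f(trial) < f(x) - rho(alpha), as in Gss-type methods.
    """
    fx = f(x)
    threshold = fx - (rho(alpha) if rho is not None else 0.0)
    for d in directions.T:        # poll each direction in turn
        trial = x + alpha * d
        if f(trial) < threshold:
            return trial, alpha   # successful poll: accept the point
    return x, alpha / 2.0         # unsuccessful poll: shrink the step
```

For example, iterating this poll with the 2n coordinate directions ±e_i drives f(x) = ||x||² toward its minimizer.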

Constructing a Basis
Rounding a Basis to the Mesh
The EadMads Instance of Mads
The EadGss Instance of a Directional Direct-Search Method
Numerical Tests
Conclusions
