Abstract

We propose the stochastic simplex bisection algorithm. It randomly selects one simplex from a set, bisects it, and replaces it with its two offspring. The selection probability is proportional to a score indicating how promising the simplex is to bisect. We generalize intervals to simplexes, rather than to hyperboxes, since bisecting a simplex requires evaluating the function at only one new point, whose placement is somewhat randomized. Using a set of simplexes that partition the search space yields completeness and avoids redundancy. We provide an appropriate scale- and offset-invariant score definition and add an outer loop for handling hyperboxes. Experiments show that the algorithm is capable of exploring vast numbers of local optima, over huge ranges, yet finding the global one. The ease with which it handles quadratic functions makes it ideal for non-linear regression: it is here successfully applied to logistic regression. The algorithm also does well when the number of function evaluations is severely restricted.
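The core loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the score used here (favoring simplexes whose best vertex is near the incumbent optimum) is a placeholder for the paper's scale- and offset-invariant score, and the split point is simply jittered around the midpoint of the longest edge.

```python
import math
import random

def make_simplex(verts, f):
    """Store a simplex as its vertices plus their function values."""
    return {"verts": list(verts), "vals": [f(v) for v in verts]}

def score(s, best):
    # Placeholder score: simplexes whose best vertex is close to the
    # incumbent optimum are deemed more promising to bisect.
    return math.exp(best - min(s["vals"]))

def bisect(s, f):
    """Split a simplex along its longest edge; one new f-evaluation."""
    verts = s["verts"]
    n = len(verts)
    # find the longest edge (i, j)
    i, j = max(((a, b) for a in range(n) for b in range(a + 1, n)),
               key=lambda e: sum((x - y) ** 2
                                 for x, y in zip(verts[e[0]], verts[e[1]])))
    t = random.uniform(0.4, 0.6)  # somewhat randomized split point
    m = tuple(x + t * (y - x) for x, y in zip(verts[i], verts[j]))
    fm = f(m)  # the single new function evaluation
    children = []
    for drop in (i, j):  # each child replaces one endpoint by m
        cv = [m if k == drop else verts[k] for k in range(n)]
        cvals = [fm if k == drop else s["vals"][k] for k in range(n)]
        children.append({"verts": cv, "vals": cvals})
    return children

def minimize(f, initial_simplexes, n_iters=200):
    """Stochastic simplex bisection over a pool of simplexes."""
    pool = [make_simplex(v, f) for v in initial_simplexes]
    for _ in range(n_iters):
        best = min(min(s["vals"]) for s in pool)
        weights = [score(s, best) for s in pool]
        s = random.choices(pool, weights=weights)[0]
        pool.remove(s)
        pool.extend(bisect(s, f))  # replace parent by its two offspring
    best_s = min(pool, key=lambda s: min(s["vals"]))
    k = min(range(len(best_s["vals"])), key=best_s["vals"].__getitem__)
    return best_s["verts"][k], best_s["vals"][k]
```

For instance, minimizing f(x, y) = x^2 + y^2 over the square [-2, 2]^2, partitioned into two triangles, quickly places split points near the origin because that simplex's score dominates the pool.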
