Abstract

The Nelder–Mead simplex method (NM), originally developed for deterministic optimization, is an efficient direct search method that optimizes the response function merely by comparing function values. While successful in deterministic settings, applying NM to simulation optimization suffers from two problems: (1) it lacks an effective sample-size scheme for controlling noise, so noise can mislead the algorithm in the wrong direction; and (2) it is a heuristic algorithm, so the quality of the estimated optimal solution cannot be quantified. We propose a new variant, called the Stochastic Nelder–Mead simplex method (SNM), that employs an effective sample-size scheme and a specially designed global-and-local search framework to address these two problems. Because it does not use gradient information, SNM can handle problems where the response function is nonsmooth or gradients do not exist, making it complementary to existing gradient-based approaches. We prove that SNM converges to the true global optimum with probability one. An extensive numerical study also shows that the performance of SNM is promising and worthy of further investigation.
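The sketch below is only an illustration of the noise issue the abstract describes, not the authors' SNM algorithm: it runs SciPy's standard Nelder–Mead on a hypothetical noisy quadratic test problem, once on single noisy observations and once on replication-averaged values. The test function, the noise level, and the fixed replication count of 30 are all assumptions made for the example; SNM's actual sample-size scheme and global/local search framework are not reproduced here.

```python
# Minimal sketch (assumptions noted above): plain Nelder-Mead on a noisy response,
# with and without replication averaging as a crude stand-in for sample-size control.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def noisy_response(x, noise_sd=1.0):
    """Hypothetical simulated response: quadratic with additive Gaussian noise."""
    return np.sum((x - 2.0) ** 2) + rng.normal(scale=noise_sd)

def averaged_response(x, n_rep=30):
    """Estimate the expected response by averaging n_rep replications at x."""
    return np.mean([noisy_response(x) for _ in range(n_rep)])

x0 = np.zeros(2)

# NM on a single noisy observation per point: comparisons can be dominated by noise.
res_noisy = minimize(noisy_response, x0, method="Nelder-Mead")

# NM on replication-averaged values: noise is dampened, comparisons are more reliable.
res_avg = minimize(averaged_response, x0, method="Nelder-Mead")

print("single-observation NM  :", res_noisy.x)
print("replication-averaged NM:", res_avg.x)
```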
