Abstract

The Nelder-Mead (NM) method is a popular derivative-free optimization algorithm owing to its fast convergence and robustness. However, the method often fails to converge, or requires long runtimes, for large-scale optimization. In the present study, the NM method has been improved using direct inversion in the iterative subspace (DIIS). DIIS is a technique that accelerates an optimization method by extrapolating a better intermediate solution from a linear combination of known ones. We compared the runtimes of the new method (NM-DIIS) and the conventional NM method on unimodal test functions of various dimensions. The NM-DIIS method outperformed the original NM method on average when the dimension of the objective function is high, and the long tails of the runtime distributions seen with the NM method disappeared when DIIS was applied. DIIS has also been implemented in the quasi-gradient method, an improved version of the NM method developed by Pham et al. [IEEE Trans. Ind. Informatics, 7 (2011) 592]. The combined method also performed well, especially on an upwardly convex test function. The present study proposes a practical optimization strategy and demonstrates the versatility of DIIS.
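The DIIS extrapolation mentioned in the abstract can be sketched as follows: given a history of candidate solutions and associated error vectors, solve for the coefficients that minimize the norm of the combined error subject to the coefficients summing to one, then form the extrapolated point. This is a minimal, generic illustration (the Pulay formulation); how the paper couples it to the NM simplex vertices is not specified in the abstract, so the choice of error vectors below is an assumption.

```python
import numpy as np

def diis_extrapolate(points, errors):
    """Generic DIIS step: find coefficients c with sum(c) = 1 that
    minimize ||sum_i c_i e_i||, then return sum_i c_i x_i.

    points : list of candidate solutions x_i (1-D arrays)
    errors : list of error vectors e_i (here assumed to be, e.g.,
             successive differences x_{i+1} - x_i; the paper's exact
             choice is not given in the abstract)
    """
    n = len(points)
    # Bordered DIIS (Pulay) matrix: B[i, j] = <e_i, e_j>, with a row and
    # column of -1s enforcing sum(c) = 1 via a Lagrange multiplier.
    B = -np.ones((n + 1, n + 1))
    B[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            B[i, j] = np.dot(errors[i], errors[j])
    rhs = np.zeros(n + 1)
    rhs[n] = -1.0
    coeffs = np.linalg.solve(B, rhs)[:n]
    return sum(c * np.asarray(x) for c, x in zip(coeffs, points))

# Toy check: for the linear iteration x -> 0.5 x (fixed point 0),
# two iterates suffice for DIIS to extrapolate to the exact solution.
pts = [np.array([1.0]), np.array([0.5])]
errs = [np.array([-0.5]), np.array([-0.25])]  # successive differences
x_new = diis_extrapolate(pts, errs)  # extrapolates to [0.0]
```

For a linear fixed-point problem the extrapolation is exact once the error vectors span the relevant subspace; for nonlinear objectives, as in NM-DIIS, it serves as an acceleration heuristic applied alongside the ordinary simplex updates.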
