Derivative-free optimization tackles problems in which the derivatives of the objective function are unknown. In practical optimization problems, however, the derivatives of the objective function are often available not with respect to all optimization variables, but only with respect to some. In this work we propose the Hermite least squares optimization method: an optimization method specialized for the case in which some partial derivatives of the objective function are available and others are not. The main goal is to reduce the number of objective function calls compared to state-of-the-art derivative-free solvers, while the convergence properties are maintained. The Hermite least squares method is a modification of Powell's derivative-free BOBYQA algorithm. However, instead of (underdetermined) interpolation for building the quadratic subproblem in each iteration, the training data is enriched with first- and, if possible, second-order derivatives, and least squares regression is used. Proofs of global convergence are discussed and numerical results are presented. Further, the applicability is verified for a realistic test case in the context of yield optimization. Numerical tests show that the Hermite least squares approach outperforms classic BOBYQA if half or more of the partial derivatives are available. In addition, it is more robust and thus performs better in the case of noisy objective functions.
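
To illustrate the core idea of enriching the regression data with available partial derivatives, the following minimal Python sketch (our own illustration, not the authors' implementation; the function name fit_quadratic_hermite_ls and its interface are assumptions) fits a quadratic model m(x) = c + g^T x + 1/2 x^T H x by least squares, using one row per function value and one additional row per available partial derivative. Second-order derivative rows, BOBYQA's trust-region machinery, and any weighting of the residuals are omitted here.

import numpy as np

def fit_quadratic_hermite_ls(Y, f_vals, grads=None, avail=None):
    """Y: (m, n) sample points; f_vals: (m,) objective values;
    grads: (m, n) partial derivatives (entries used only where avail is True);
    avail: boolean mask (n,) marking variables with available derivatives."""
    m, n = Y.shape
    # Parameters: c (1), g (n), upper triangle of symmetric H (n*(n+1)/2)
    n_par = 1 + n + n * (n + 1) // 2

    def quad_features(x):
        # Features multiplying (c, g, vech(H)) in m(x)
        feats = [1.0] + list(x)
        for i in range(n):
            for j in range(i, n):
                feats.append((0.5 if i == j else 1.0) * x[i] * x[j])
        return feats

    def grad_features(x, k):
        # Features multiplying (c, g, vech(H)) in dm/dx_k
        feats = [0.0] * n_par
        feats[1 + k] = 1.0
        idx = 1 + n
        for i in range(n):
            for j in range(i, n):
                if i == k:
                    feats[idx] += x[j]          # covers h_kk * x_k and h_kj * x_j
                if j == k and i != j:
                    feats[idx] += x[i]          # covers h_ik * x_i
                idx += 1
        return feats

    rows, rhs = [], []
    for p in range(m):
        rows.append(quad_features(Y[p]))        # function-value row
        rhs.append(f_vals[p])
        if grads is not None and avail is not None:
            for k in np.where(avail)[0]:
                rows.append(grad_features(Y[p], k))  # derivative row for variable k
                rhs.append(grads[p, k])

    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return coef  # (c, g, vech(H)) of the fitted quadratic model

With derivative rows included, fewer sample points (and hence fewer objective function calls) are needed to determine the quadratic model than with interpolation on function values alone, which is the mechanism behind the reported savings.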