This paper develops a line-search algorithm that uses objective function models with tunable accuracy to solve smooth optimization problems with convex constraints. Evaluating the objective function and its gradient is potentially computationally expensive, but it is assumed that effective, computationally inexpensive models can be constructed. The paper specifies how these models can be used to generate new iterates. At each iteration, the model must satisfy function-error and relative gradient-error tolerances that the algorithm determines based on its progress. Moreover, a bound on the model error is used to explore regions where the model is sufficiently accurate. The algorithm has the same first-order global convergence properties as standard line-search methods, yet it uses only the models and the model error bounds. The algorithm is applied to problems in which evaluating the objective requires the solution of a large-scale system of nonlinear equations; the models are constructed from reduced-order models of this system. Numerical results for partial differential equation constrained optimization problems demonstrate the benefits of the proposed algorithm.
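To illustrate the accuracy-control idea described above, the following is a minimal, hypothetical Python sketch, not the paper's actual algorithm: a backtracking line search that accepts a step only when the model's error bound is small relative to the required decrease, and tightens the model accuracy tolerance otherwise. All interface names (`model_f`, `model_g`, `model_err_bound`) are assumptions, and the sketch is an unconstrained steepest-descent simplification, whereas the paper treats convex constraints and separate function/gradient error tolerances.

```python
import numpy as np

def inexact_linesearch(model_f, model_g, model_err_bound, x0,
                       tol=1e-6, max_iter=100, c1=1e-4):
    """Sketch of a line search driven by an inexact objective model.

    Assumed interfaces (hypothetical):
      model_f(x, eps) -- model value at x with error below eps
      model_g(x, eps) -- model gradient at x with relative error below eps
      model_err_bound(x) -- a posteriori bound on the model error at x
    """
    x = x0.copy()
    eps = 1e-1  # initial (loose) model accuracy tolerance
    for _ in range(max_iter):
        g = model_g(x, eps)
        if np.linalg.norm(g) < tol:
            break
        d = -g  # steepest-descent direction on the model
        t, accepted = 1.0, False
        for _ in range(50):  # backtracking with accuracy control
            trial = x + t * d
            decrease = model_f(x, eps) - model_f(trial, eps)
            needed = c1 * t * float(np.dot(g, g))  # Armijo decrease target
            if model_err_bound(trial) > 0.25 * needed:
                eps *= 0.5  # model not accurate enough here: tighten tolerance
            elif decrease >= needed:
                accepted = True  # sufficient model decrease, step accepted
                break
            else:
                t *= 0.5  # Armijo condition failed: shrink the step
        if not accepted:
            break
        x = trial
    return x
```

For instance, with an exact quadratic model (zero error bound), the iterates converge to the minimizer of `f(x) = x·x` as an ordinary backtracking line search would; the error-bound test only becomes active when the model is genuinely inexact.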