Abstract

In this paper, a new global optimization method is proposed for optimization problems with twice-differentiable objective and constraint functions of a single variable. The method employs a difference-of-convex underestimator and a convex cut function, where the former is a continuous piecewise concave quadratic function and the latter is a convex quadratic function. The main objectives of this research are to construct a concave quadratic underestimator whose lower bound on the objective function can be computed without an iterative local optimizer, and to construct a convex cut function that effectively excludes infeasible regions arising from nonconvex constraints. The proposed method is proven to have finite ε-convergence to the global optimum point. The numerical experiments indicate that the proposed method is competitive with another covering method, the index branch-and-bound algorithm, which relies on the Lipschitz constant.

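To illustrate the kind of bound the abstract refers to, the sketch below shows one standard way a concave quadratic underestimator yields a lower bound without any iterative local search: if the second derivative of f is bounded below by some m < 0 on an interval, a second-order Taylor model built with m underestimates f, and, being concave, attains its minimum at an interval endpoint. This is a minimal sketch under that assumption, not the authors' exact construction; the function name and the sample curvature bound m are illustrative only.

```python
import math

def concave_quadratic_lower_bound(f, df, a, b, c, m):
    """Lower bound of f over [a, b] from a concave quadratic underestimator.

    Assumes f is twice differentiable with f''(x) >= m on [a, b] and m < 0
    (the nonconvex case). Then
        q(x) = f(c) + f'(c)(x - c) + (m/2)(x - c)^2
    satisfies q(x) <= f(x) on [a, b] and is concave, so its minimum over
    [a, b] is attained at an endpoint; no iterative local optimizer is needed.
    """
    q = lambda x: f(c) + df(c) * (x - c) + 0.5 * m * (x - c) ** 2
    return min(q(a), q(b))

# Illustrative use: f(x) = sin(x) + 0.1*x**2 on [0, 4].
# Here f''(x) = -sin(x) + 0.2 >= -0.8, so m = -0.8 is a valid curvature bound.
f = lambda x: math.sin(x) + 0.1 * x ** 2
df = lambda x: math.cos(x) + 0.2 * x
lb = concave_quadratic_lower_bound(f, df, a=0.0, b=4.0, c=2.0, m=-0.8)
print(lb)  # a valid lower bound on min f over [0, 4]
```

In a branch-and-bound (covering) scheme, bounds of this form are recomputed on each subinterval and compared against the best feasible objective value found so far, which is where the finite ε-convergence argument enters.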