Abstract
This paper proposes a more efficient method for solving nonlinear programming problems with many complicated constraints. The constrained optimization problem is first converted into a minimax problem, whose max-value function is smoothed approximately by the so-called flattened aggregate function or a modified version of it. With carefully updated aggregate parameters, the resulting smooth unconstrained optimization problem is solved by an inexact Newton method. Because the flattened aggregate function usually reduces considerably the amount of computation required for gradients and Hessians, the method is efficient. Convergence of the proposed method is proved, and numerical results are reported to demonstrate its efficiency.
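As a brief sketch (the precise flattened form used in the paper is not reproduced here), the classical aggregate function smooths the max-value function $F(x) = \max_{1 \le i \le m} f_i(x)$ by a log-sum-exp approximation with aggregate parameter $p > 0$; the flattened variant, loosely speaking, retains only the terms near the current maximum, which is why far fewer gradient and Hessian terms need to be evaluated:
\[
  F_p(x) = \frac{1}{p}\,\ln \sum_{i=1}^{m} \exp\bigl(p\, f_i(x)\bigr),
  \qquad
  F(x) \;\le\; F_p(x) \;\le\; F(x) + \frac{\ln m}{p}.
\]
The bound shows that $F_p$ converges uniformly to $F$ as $p \to \infty$, which motivates updating the aggregate parameter between inexact Newton steps.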