Abstract

In this paper, we present Wolfe's reduced gradient method for multiobjective (multicriteria) optimization. Specifically, we deal with the problem of minimizing nonlinear objectives under linear constraints and propose a reduced Jacobian method, namely a reduced-gradient-like method that does not scalarize such programs. As long as a nondominated solution has not been attained, the principle is to determine a direction that decreases all objectives simultaneously in order to reach one. Following the reduction strategy, only a reduced search direction needs to be found. We show that the latter can be obtained by solving a simple differentiable convex program at each iteration. Moreover, the method is designed to recover both the discontinuous and continuous schemes of Wolfe for single-objective programs. The resulting algorithm is proved to be globally convergent to a Pareto KKT-stationary (Pareto critical) point under classical hypotheses and a multiobjective Armijo line-search condition. Finally, experimental results on test problems show the clear performance of the proposed algorithm and its superiority over a classical scalarization approach, both in the quality of the approximated Pareto front and in the computational effort.
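For orientation, here is a hedged sketch of the kind of direction-finding subproblem and line search the abstract refers to, written in the classical (non-reduced) form of multiobjective steepest descent in the style of Fliege and Svaiter; the paper's reduced version restricts the search to the nonbasic variables, so the exact formulation there may differ, and the symbols $f_i$, $x$, $d$, $\tau$, $\beta$ below are illustrative. Given smooth objectives $f_1,\dots,f_m$ and a feasible point $x$, a direction decreasing all objectives simultaneously can be obtained from the differentiable convex (quadratic) program

$$\min_{(\tau,d)\in\mathbb{R}\times\mathbb{R}^n}\ \tau+\tfrac{1}{2}\|d\|^{2}\qquad\text{s.t.}\quad \nabla f_i(x)^{\top}d\le\tau,\quad i=1,\dots,m,$$

and a step length $t>0$ is then accepted by a multiobjective Armijo rule of the form

$$f_i(x+td)\le f_i(x)+\beta\,t\,\nabla f_i(x)^{\top}d\qquad\text{for all } i=1,\dots,m,$$

with a parameter $\beta\in(0,1)$. In such schemes, the subproblem returns $d=0$ exactly at a Pareto critical point, which serves as the natural stopping criterion.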
