Abstract

The method of multipliers [1-3] (MOM) is a transformation technique that has enjoyed considerable popularity in recent years. Its algorithmic philosophy is similar to that of conventional penalty function methods, in that a constrained nonlinear programming problem is transformed into a sequence of unconstrained problems. In the standard MOM approach, the multipliers are updated after each unconstrained search. In this paper we investigate methods that involve continuous updating of the penalty parameters and design variables. We demonstrate that this continuous updating scheme is equivalent to the generalized reduced gradient method [4,5] applied to a certain dual problem. Computational results are given that suggest the continuous-updating MOM is not as efficient as one might reasonably hope.
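The standard MOM loop summarized above (an unconstrained minimization of the augmented Lagrangian, followed by a multiplier update) can be sketched as follows. This is a minimal illustration, not the paper's method: the toy problem, step sizes, and iteration counts are assumptions chosen for clarity.

```python
# Standard method of multipliers (augmented Lagrangian) sketch for
#   min f(x)  subject to  h(x) = 0
# on an assumed toy problem: f(x) = x1^2 + x2^2, h(x) = x1 + x2 - 1,
# whose known optimum is x* = (0.5, 0.5) with multiplier lam* = -1.

def f_grad(x):
    return [2.0 * x[0], 2.0 * x[1]]

def h(x):
    return x[0] + x[1] - 1.0

def h_grad(x):
    return [1.0, 1.0]

def inner_minimize(x, lam, c, steps=2000, lr=0.01):
    """Gradient descent on the augmented Lagrangian
    L(x) = f(x) + lam*h(x) + (c/2)*h(x)^2, with lam and c held fixed."""
    for _ in range(steps):
        hx = h(x)
        # grad L = grad f + (lam + c*h(x)) * grad h
        g = [fg + (lam + c * hx) * hg
             for fg, hg in zip(f_grad(x), h_grad(x))]
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def method_of_multipliers(x0, lam0=0.0, c=10.0, outer=20):
    """Standard MOM: the multiplier is updated only after each
    complete unconstrained search (the scheme the paper contrasts
    with continuous updating)."""
    x, lam = x0, lam0
    for _ in range(outer):
        x = inner_minimize(x, lam, c)   # full unconstrained search
        lam = lam + c * h(x)            # multiplier update afterwards
    return x, lam

x, lam = method_of_multipliers([0.0, 0.0])
```

On this toy problem the iterates converge to x = (0.5, 0.5) and lam = -1; the continuous-updating variant studied in the paper instead adjusts the multipliers during the search rather than after it.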
