Abstract

In the analysis of inverse problems in electromagnetics, the object function that is defined is not only a minimum at the optimum but also zero there. As a result, the Newton algorithm, which reaches the neighborhood of the optimum rather quickly, degenerates because of the division of a zero by a zero. This paper takes up two issues. First, we investigate the use of an alternative function that locates the optimum not at a minimum but at a zero; this allows the Newton algorithm to be used without restriction. Second, we take up the case where the usual function with a zero minimum is retained but its second derivative (the Hessian) is nonsingular; here the design parameters are sought using the second derivative of the object function when the computation by the first method becomes sluggish. It is further shown that the second derivative of the object function is naturally computable from the finite-element trial functions and the solution.
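The two remedies described above can be made concrete with a one-dimensional toy problem. The sketch below is illustrative only and is not the paper's finite-element formulation: the quadratic object function, the alternative function g, and the parameter value P_STAR are all assumptions chosen to expose the behavior. It contrasts root-finding Newton on the zero-minimum object function (whose step tends to 0/0 near the optimum), root-finding Newton on an alternative function that is zero but not a minimum at the optimum, and minimization Newton using the second derivative.

```python
P_STAR = 2.0  # hypothetical true design parameter (illustration only)

def F(p):
    """Conventional object function: non-negative, with a zero minimum at P_STAR."""
    return (p - P_STAR) ** 2

def dF(p):
    return 2.0 * (p - P_STAR)

def d2F(p):
    return 2.0

def g(p):
    """Alternative function: zero at P_STAR, but with a non-vanishing slope there."""
    return p - P_STAR

def dg(p):
    return 1.0

# 1) Root-finding Newton applied to F: step = -F/dF. Both numerator and
#    denominator vanish at the optimum, so the step approaches 0/0; here
#    each iteration merely halves the error and the method becomes sluggish.
p = 0.0
for _ in range(50):
    if abs(dF(p)) < 1e-15:
        break  # derivative has vanished: the 0/0 degeneration
    p -= F(p) / dF(p)
print("root-Newton on F :", p)

# 2) Root-finding Newton applied to g: the optimum is a zero of g but not a
#    minimum, so dg stays away from zero and the step remains well defined.
p = 0.0
for _ in range(10):
    p -= g(p) / dg(p)
print("root-Newton on g :", p)

# 3) Minimization Newton on F using the second derivative (the Hessian):
#    step = -dF/d2F. The Hessian is nonsingular at the minimum, so no 0/0 arises.
p = 0.0
for _ in range(10):
    p -= dF(p) / d2F(p)
print("Hessian Newton   :", p)
```

Under these assumptions, the second and third iterations reach P_STAR in a single step, while the first stalls as F and dF both shrink toward zero, which is the degeneration the abstract refers to.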
