Abstract

Interval Newton methods in conjunction with generalized bisection can form the basis of algorithms that find all real roots within a specified box ${\bf X} \subset {\bf R}^n$ of a system of nonlinear equations $F(X) = 0$ with mathematical certainty, even in finite-precision arithmetic. In such methods, the system $F(X) = 0$ is transformed into a linear interval system $0 = F(M) + {\bf F'}({\bf X})({\bf \bar X} - M)$; if interval arithmetic is then used to bound the solutions of this system, the resulting box ${\bf \bar X}$ contains all roots of the nonlinear system. The interval Gauss–Seidel method is a reasonable way of finding such solution bounds. For the overall interval Newton/bisection algorithm to be efficient, the image box ${\bf \bar X}$ should be as small as possible. To achieve this, the linear interval system is multiplied by a preconditioner matrix Y before the interval Gauss–Seidel method is applied. In this paper, a technique for computing such preconditioner matrices Y is described. This technique involves optimality conditions expressible as linear programming problems. In many instances, the resulting preconditioners give an ${\bf \bar X}$ of minimal width. They can also be applied when ${\bf F'}$ approximates a singular matrix, and the optimality conditions can be altered to describe preconditioners with a given structure. The technique is illustrated with some simple examples and with numerical experiments. These experiments indicate that the new preconditioner results in significantly fewer function and Jacobian evaluations, especially for ill-conditioned problems, although it requires more computation to obtain.
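As context for the method the abstract summarizes, the following is a minimal sketch of one preconditioned interval Gauss–Seidel sweep. It does not implement the paper's LP-based optimal preconditioner; it assumes the preconditioned Jacobian enclosure $Y{\bf F'}({\bf X})$ and residual $YF(M)$ are already given (the classical choice for Y being the inverse of the midpoint Jacobian). Intervals are represented as `(lo, hi)` tuples, and all function names here are illustrative, not from the paper.

```python
def iadd(a, b): return (a[0] + b[0], a[1] + b[1])
def isub(a, b): return (a[0] - b[1], a[1] - b[0])
def imul(a, b):
    # Interval product: extremes occur at endpoint products.
    p = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(p), max(p))
def idiv(a, b):
    # Simple case only: the denominator interval must exclude zero.
    assert b[0] > 0 or b[1] < 0, "denominator interval contains 0"
    return imul(a, (1.0 / b[1], 1.0 / b[0]))
def iintersect(a, b): return (max(a[0], b[0]), min(a[1], b[1]))

def gauss_seidel_step(YA, Yb, X, M):
    """One sweep on the preconditioned system Y*F'(X)*(X - M) = -Y*F(M).

    YA: n x n matrix of intervals (preconditioned Jacobian enclosure)
    Yb: length-n vector of intervals (preconditioned residual Y*F(M))
    X:  current box (list of intervals), M: its midpoint vector.
    Returns the contracted box; any root of F in X remains inside it.
    """
    X = list(X)
    n = len(X)
    for i in range(n):
        # numerator = -Yb_i - sum_{j != i} YA_ij * (X_j - M_j)
        num = (-Yb[i][1], -Yb[i][0])
        for j in range(n):
            if j != i:
                num = isub(num, imul(YA[i][j], isub(X[j], (M[j], M[j]))))
        # Candidate enclosure of X_i - M_i; shift by M_i, keep the intersection.
        q = idiv(num, YA[i][i])
        X[i] = iintersect(X[i], iadd(q, (M[i], M[i])))
    return X
```

For instance, for the scalar problem $f(x) = x^2 - 2$ on $X = [1, 2]$ with $M = 1.5$, $f'(X) = 2X = [2, 4]$, and midpoint preconditioner $Y = 1/3$, one sweep contracts $[1, 2]$ to roughly $[1.375, 1.4375]$, which still encloses $\sqrt{2}$. The choice of Y governs how tight this contraction is, which is precisely what the paper's LP-derived preconditioners optimize.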
