Abstract

Numerical analysts and scientists working in applications often observe that once they improve their techniques to achieve better accuracy, some instability of the evaluation creeps in through the back door. This paper shows, for a large class of numerical methods, that such a Trade-off Principle between error and evaluation stability is unavoidable: it is an instance of a no-free-lunch theorem. Here, evaluation is the mathematical map that takes input data to output data, independent of the numerical routine that calculates the output; evaluation stability is therefore different from computational stability. The setting is confined to the recovery of functions from data, but it includes solving differential equations, by writing such methods as the recovery of functions under constraints imposed by differential operators and boundary values. The trade-off principle bounds the product of two terms from below: the first is related to errors, and the second turns out to be related to evaluation instability. Under certain conditions, satisfied for splines and kernel-based interpolation, both terms can be minimized. Then the lower bound is attained, and the error term is the inverse of the instability term. As a byproduct, it is shown that Kansa’s Unsymmetric Collocation Method sacrifices accuracy for improved evaluation stability when compared to symmetric collocation.
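
The flavor of this trade-off is easy to observe numerically. The sketch below is not taken from the paper; it interpolates with a Gaussian kernel on equispaced nodes and, at a fixed evaluation point, reports the power function as a stand-in for the error term and the Lebesgue function (the norm of the data-to-value evaluation map) as a stand-in for the instability term. Both choices, and all parameter values, are illustrative assumptions rather than the paper's exact quantities. Flattening the kernel drives the error term down while the evaluation map becomes less stable.

```python
# Illustrative sketch (assumptions, not the paper's construction):
# Gaussian-kernel interpolation in 1D. Flatter kernels (smaller eps)
# shrink the power function P(x0) (an error indicator) while the
# Lebesgue function L(x0) (an evaluation-instability indicator) grows.
import numpy as np

def gauss(x, y, eps):
    """Gaussian kernel K(x, y) = exp(-(eps * (x - y))^2)."""
    return np.exp(-(eps * np.subtract.outer(x, y)) ** 2)

nodes = np.linspace(-1.0, 1.0, 10)   # equispaced interpolation nodes
x0 = 0.9                             # evaluation point near the boundary

for eps in [8.0, 4.0, 2.0, 1.0]:     # smaller eps => flatter kernel
    A = gauss(nodes, nodes, eps)               # kernel matrix A_ij = K(x_i, x_j)
    k = gauss(np.array([x0]), nodes, eps).ravel()
    u = np.linalg.solve(A, k)                  # evaluation weights: s(x0) = u @ f
    P = np.sqrt(max(1.0 - k @ u, 0.0))         # power function; K(x0, x0) = 1
    L = np.abs(u).sum()                        # Lebesgue function at x0
    print(f"eps = {eps:4.1f}   error P(x0) = {P:.3e}   instability L(x0) = {L:.3e}")

# Note: for much smaller eps the linear solve itself becomes ill-conditioned.
# That is a *computational*-stability issue, distinct from the evaluation
# instability measured by L(x0), matching the distinction drawn in the abstract.
```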
