Abstract

When applied to stiff problems, the effective order of convergence of general linear methods is governed by their stage order, which is less than or equal to the classical order of the method. This causes an order reduction phenomenon, present in all general linear methods except those with high stage order, similar to that observed in other time integrators with internal stages. In this paper, we investigate the order reduction that arises when general linear methods are used as time integrators in the method of lines for the numerical solution of initial boundary value problems with time-dependent boundary values. We propose a technique, based on an appropriate choice of the boundary values for the internal stages, with which one unit of order can be recovered, as we prove in this work. As expected, this yields a considerable improvement for general linear methods suffering order reduction. Moreover, numerical experiments show that the improvement is not confined to these cases: even when no order reduction is expected, the proposed technique drastically reduces the size of the errors.
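The order reduction phenomenon described above can be illustrated with a standard example that is not taken from the paper: the Prothero-Robinson test problem y' = λ(y - g(t)) + g'(t), whose exact solution is y(t) = g(t), integrated with the two-stage Gauss Runge-Kutta method (classical order 4, stage order 2). The sketch below, under these assumptions, estimates the observed convergence order in the non-stiff and strongly stiff regimes; for large |λ| the order drops toward the stage order.

```python
# Illustrative sketch (not the paper's method): order reduction of the
# 2-stage Gauss Runge-Kutta method on the Prothero-Robinson problem
#   y' = lam*(y - g(t)) + g'(t),  y(0) = g(0),  exact solution y(t) = g(t).
import math
import numpy as np

# 2-stage Gauss coefficients: classical order 4, stage order 2.
s3 = math.sqrt(3.0)
A = np.array([[0.25, 0.25 - s3 / 6.0],
              [0.25 + s3 / 6.0, 0.25]])
b = np.array([0.5, 0.5])
c = np.array([0.5 - s3 / 6.0, 0.5 + s3 / 6.0])

def g(t):            # prescribed exact solution
    return math.cos(t)

def dg(t):           # its derivative
    return -math.sin(t)

def solve(lam, n_steps, T=1.0):
    """Integrate the Prothero-Robinson problem; return |error| at t = T."""
    h = T / n_steps
    y, t = g(0.0), 0.0
    I = np.eye(2)
    for _ in range(n_steps):
        # The problem is linear in y, so the stage derivatives K satisfy
        # the 2x2 linear system (I - h*lam*A) K = lam*y + r(t + c*h),
        # with r(t) = g'(t) - lam*g(t).
        r = np.array([dg(t + ci * h) - lam * g(t + ci * h) for ci in c])
        K = np.linalg.solve(I - h * lam * A, lam * y + r)
        y += h * (b @ K)
        t += h
    return abs(y - g(T))

def observed_order(lam, n):
    """Estimate the convergence order from errors at step sizes h and h/2."""
    e1, e2 = solve(lam, n), solve(lam, 2 * n)
    return math.log2(e1 / e2)

print("non-stiff (lam = -1):  observed order", observed_order(-1.0, 10))
print("stiff (lam = -1e6):    observed order", observed_order(-1.0e6, 20))
```

In the non-stiff regime the estimate is close to the classical order 4, while in the stiff regime it falls well below it, which is the effect the boundary-value technique of the paper is designed to mitigate in the partial differential equation setting.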
