Abstract
The Fejér and Clenshaw–Curtis rules for numerical integration exhibit a curious phenomenon when applied to certain analytic functions. As N (the number of points in the integration rule) increases, the error does not decay to zero evenly but does so in two distinct stages. For N less than a critical value, the error behaves like O(ρ^{-2N}), where ρ is a constant greater than 1. For these values of N the accuracy of both the Fejér and Clenshaw–Curtis rules is almost indistinguishable from that of the more celebrated Gauss–Legendre quadrature rule. For larger N, however, the error decreases at the rate O(ρ^{-N}), i.e., only half as fast as before. Convergence curves typically display a kink where the convergence rate cuts in half. In this paper we derive explicit as well as asymptotic error formulas that provide a complete description of this phenomenon.
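To make the kink phenomenon concrete, the following minimal Python sketch (an illustration, not code from the paper) compares the (n+1)-point Clenshaw–Curtis rule with the n-point Gauss–Legendre rule on the hypothetical test integrand f(x) = 1/(1 + 16x²), which is analytic on [-1, 1] but has poles at ±i/4. The Clenshaw–Curtis weights are computed from the standard cosine-sum formula; for smaller n the two error columns track each other closely, while for larger n the Clenshaw–Curtis error decays at roughly half the rate.

```python
import numpy as np

def clenshaw_curtis(n):
    """Nodes and weights of the (n+1)-point Clenshaw-Curtis rule on [-1, 1]."""
    k = np.arange(n + 1)
    x = np.cos(k * np.pi / n)                    # Chebyshev extreme points
    j = np.arange(1, n // 2 + 1)
    b = np.where(j == n / 2, 1.0, 2.0)           # last term is halved when n is even
    # w_k = (c_k / n) * [1 - sum_j b_j * cos(2 j k pi / n) / (4 j^2 - 1)]
    s = (b / (4 * j**2 - 1)) @ np.cos(2 * np.outer(j, k) * np.pi / n)
    w = (1 - s) / n
    w[1:-1] *= 2                                 # interior nodes have c_k = 2
    return x, w

# Hypothetical test integrand: analytic on [-1, 1], poles at +/- i/4
f = lambda x: 1.0 / (1.0 + 16.0 * x**2)
exact = 0.5 * np.arctan(4.0)                     # integral of f over [-1, 1]

for n in [4, 8, 16, 32, 64, 128]:
    xc, wc = clenshaw_curtis(n)
    xg, wg = np.polynomial.legendre.leggauss(n)  # n-point Gauss-Legendre rule
    err_cc = abs(wc @ f(xc) - exact)
    err_gl = abs(wg @ f(xg) - exact)
    print(f"N = {n:4d}   Clenshaw-Curtis error = {err_cc:.2e}   Gauss error = {err_gl:.2e}")
```

Plotting these errors against N on a semilog scale reproduces the kinked convergence curve described above: an initial slope corresponding to ρ^{-2N}, then a bend to the slower ρ^{-N} decay.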