Abstract

We present two derivative-free Steffensen-type methods with memory for solving nonlinear equations. By introducing a suitable self-accelerating parameter into existing optimal fourth- and eighth-order methods without memory, the order of convergence is increased without any extra function evaluations; the efficiency index therefore increases as well, which is the main contribution of this paper. The self-accelerating parameters are estimated using Newton's interpolation. To demonstrate the applicability of the proposed methods, several numerical examples are presented.

Highlights

  • Finding the root of a nonlinear equation frequently occurs in scientific computation

  • We extend these schemes to methods with memory, since their error equations contain a parameter that can be approximated so as to increase the local order of convergence

  • The efficiency of the existing methods is improved by using information from the current and previous iterations, without any additional function evaluations


Summary

Introduction

Finding the root of a nonlinear equation frequently occurs in scientific computation. Kung and Traub discussed two general n-step methods based on interpolation and conjectured that any multipoint method without memory using n function evaluations can reach a convergence order of at most 2^(n−1) [9]. Both Newton's and Steffensen's methods are optimal in this sense. To improve the order of convergence, and hence the efficiency index, without any new function evaluations, Traub introduced methods with memory in his book. He modified Steffensen's method slightly by attaching a self-accelerating parameter that is updated at each iteration from previously computed values.
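As an illustration of the with-memory idea, a minimal sketch of Traub's classical variant is given below (this is the simplest first-order case of the self-accelerating strategy; the paper's methods use higher-degree Newton interpolation). Here the parameter γ_k multiplies f(x_k) to form the auxiliary point w_k = x_k + γ_k f(x_k), and after each step γ is re-estimated from the two most recent iterates as γ_{k+1} = −(x_{k+1} − x_k)/(f(x_{k+1}) − f(x_k)) ≈ −1/f′(α), which raises the order of Steffensen's method from 2 to 1+√2 without extra function evaluations. The function name and defaults are our own:

```python
def steffensen_with_memory(f, x0, gamma0=-0.01, tol=1e-12, max_iter=50):
    """Traub-style Steffensen iteration with a self-accelerating parameter.

    Each step uses exactly two function evaluations: f(x_k) is reused
    from the previous step's gamma update, so only f(w_k) and f(x_{k+1})
    are new per iteration.
    """
    x, gamma = x0, gamma0
    f_x = f(x)
    for _ in range(max_iter):
        w = x + gamma * f_x                  # auxiliary point w_k = x_k + gamma_k f(x_k)
        denom = f(w) - f_x                   # gamma_k f(x_k) f[x_k, w_k]
        if denom == 0:
            break
        x_new = x - gamma * f_x**2 / denom   # Steffensen step: x_k - f(x_k)/f[x_k, w_k]
        f_new = f(x_new)
        if f_new != f_x:
            # Self-accelerating update: gamma_{k+1} = -1/f[x_k, x_{k+1}] ~ -1/f'(alpha)
            gamma = -(x_new - x) / (f_new - f_x)
        x, f_x = x_new, f_new
        if abs(f_x) < tol:
            break
    return x
```

For example, applied to f(x) = x² − 2 with x0 = 1.5, the iteration converges rapidly to √2; the same scheme works for any smooth scalar equation without requiring a derivative.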

Development and Construction with Memory
Method
Application to Nonlinear Equations
Conclusion
