Abstract

A number of higher order iterative methods with derivative evaluations have been developed in the literature for computing multiple zeros. However, higher order methods without derivatives for multiple zeros are difficult to obtain, and hence such methods are rare in the literature. Motivated by this fact, we present a family of eighth order derivative-free methods for computing multiple zeros. Per iteration the methods require only four function evaluations; therefore, they are optimal in the sense of the Kung-Traub conjecture. Stability of the proposed class is demonstrated by means of a graphical tool, namely, basins of attraction. The boundaries of the basins are fractal-like shapes, about which the basins are symmetric. Applicability of the methods is demonstrated on different nonlinear functions, which illustrates their efficient convergence behavior. Comparison of the numerical results shows that the new derivative-free methods are good competitors to the existing optimal eighth-order techniques that require derivative evaluations.

Highlights

  • Approximating a root of a function is a very challenging task

  • We introduce a family of eighth order derivative-free methods for computing multiple zeros that require the evaluations of four functions per iteration, and the family has optimal convergence of eighth order in the sense of the Kung-Traub conjecture (see the note following these highlights)

  • Performance is compared with some existing eighth-order methods requiring derivative evaluations in their formulae
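
For context, the Kung-Traub conjecture bounds the convergence order of an iterative method without memory by 2^(n-1) when n function evaluations are used per iteration; the eighth-order optimality claimed above corresponds to n = 4:

```latex
% Kung-Traub bound on the convergence order p for n function evaluations per iteration
p \le 2^{\,n-1}, \qquad n = 4 \;\Rightarrow\; p \le 2^{3} = 8 .
```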

Summary

Introduction

Approximating a root (say, α) of a function is a very challenging task. It is very important in many diverse areas such as Mathematical Biology, Physics, Chemistry, Economics and Engineering, to mention a few [1,2,3,4]. Derivative-free methods are important in situations where the derivative of the function f is complicated to evaluate or is expensive to obtain. One such derivative-free method is the classical Traub-Steffensen method [1], which replaces the derivative f′ in the classical Newton's method by a suitable approximation based on a finite difference quotient, f′(x_k) ≈ [f(x_k + β f(x_k)) − f(x_k)] / (β f(x_k)), with β ≠ 0. The modified Traub-Steffensen method (2) is a noticeable improvement of Newton's iteration, since it preserves the order of convergence without using any derivative. The techniques of [25] require four function evaluations per iteration and, according to the Kung-Traub hypothesis, do not possess optimal convergence [26]. We introduce a family of eighth order derivative-free methods for computing multiple zeros that require the evaluations of four functions per iteration, and the family has optimal convergence of eighth order in the sense of the Kung-Traub conjecture.
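
To make the divided-difference idea concrete, the following is a minimal Python sketch of the Traub-Steffensen step described above. It is not the eighth-order family proposed in the paper; the multiplicity factor m, commonly used in multiple-zero variants such as the modified method (2), is included here as an assumption, and beta is the free nonzero parameter of the finite difference approximation.

```python
# Minimal sketch of the Traub-Steffensen iteration (not the paper's
# eighth-order family).  Newton's derivative f'(x_k) is replaced by the
# divided difference f[w_k, x_k] with w_k = x_k + beta*f(x_k).
# The multiplicity factor m is an assumed modification for multiple zeros.

def traub_steffensen(f, x0, beta=0.01, m=1, tol=1e-12, max_iter=100):
    """Derivative-free step x_{k+1} = x_k - m*f(x_k) / f[w_k, x_k]."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        w = x + beta * fx              # auxiliary point w_k
        dd = (f(w) - fx) / (w - x)     # divided difference f[w_k, x_k]
        x = x - m * fx / dd            # Traub-Steffensen update
    return x

# Example: (x - 1)**3 has a zero of multiplicity 3 at x = 1.
print(traub_steffensen(lambda x: (x - 1.0)**3, x0=1.5, m=3))
```

With m set to the multiplicity of the zero, this step uses only function values, consistent with the remark above that the modified method preserves the order of convergence without any derivative.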

Development of Method
Complex Dynamics of Methods
Numerical Results
Methods
Conclusions