Abstract

We provide a local convergence analysis for Newton’s method under mild differentiability conditions on the operator involved, in a Banach space setting. In particular, we show that, under the same hypotheses and at the same computational cost, the use of more precise estimates yields a larger convergence radius and finer error bounds on the distances involved than before (Huang in Comput. Math. Math. 42:247–251, 2004; Rheinboldt in Banach Ctr. Publ. 3:129–142, 1977; Wang in IMA J. Numer. Anal. 20:123–134, 2000). Numerical examples are provided to further justify the use of our results over the earlier ones mentioned above.
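To make the notion of a convergence radius concrete, the following is a minimal Python sketch (not taken from the paper) of the Newton iteration applied to a simple scalar equation, checking convergence from starting points inside a candidate ball around the solution. The test function, the Lipschitz-type constant L, and the classical Rheinboldt/Traub-type radius 2/(3L) used here are illustrative assumptions; the paper’s contribution is precisely that sharper constants enlarge such a radius and tighten the error bounds.

import numpy as np

# Minimal sketch (not from the paper): Newton's method x_{k+1} = x_k - F'(x_k)^{-1} F(x_k)
# on a scalar example, started from points inside a candidate convergence ball.
# The function F, the constant L, and the radius formula below are illustrative
# assumptions, not the estimates derived in the paper.

def newton(F, dF, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration for F(x) = 0; returns the iterate and iteration count."""
    x = x0
    for k in range(max_iter):
        step = F(x) / dF(x)
        x = x - step
        if abs(step) < tol:
            return x, k + 1
    return x, max_iter

# Example: F(x) = exp(x) - 1 with solution x* = 0 and F'(x*) = 1.
F = lambda x: np.exp(x) - 1.0
dF = lambda x: np.exp(x)

# Classical Lipschitz-based radius r = 2 / (3 L); here L = e is a rough bound
# for the scaled derivative on [-1, 1] (an assumption made for this sketch only).
L = np.e
r = 2.0 / (3.0 * L)

for x0 in np.linspace(-r, r, 5):
    x, iters = newton(F, dF, x0)
    print(f"x0 = {x0:+.4f} -> x = {x:+.2e} in {iters} iterations")

Running the sketch, every starting point sampled inside the ball of radius r converges rapidly to the solution, which is the qualitative behavior a convergence-radius result guarantees; a larger provable radius admits more such starting points under the same hypotheses.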
