Abstract
We consider the multivalued problem of finding all solutions of the equation f(x) = 0 in the space of functions f : [0, 1] → R such that the derivative f^(r), with r ∈ {0, 1, 2, . . .}, exists and is Hölder continuous with exponent ϱ ∈ (0, 1]. Available algorithms use information about values of f and/or its derivatives at n adaptively selected points, and the error between the true solution set Z(f) and an approximate solution Z_n(f) is measured by the Hausdorff distance d_H(Z(f), Z_n(f)). We show that, although the worst-case error of any algorithm is infinite, it is possible to construct nonadaptive approximations Z_n^* such that the error d_H(Z(f), Z_n^*(f)) converges to zero as n → +∞. However, the convergence can be arbitrarily slow. Specifically, for any sequence of approximations {Z_n}_{n≥1} that use n adaptively chosen values of the function and/or its derivatives, and for any positive sequence {τ_n}_{n≥1} converging to zero, there are functions f in our space such that sup_{n≥1} τ_n^{-1} d_H(Z(f), Z_n(f)) = +∞. We conjecture that the same lower bound holds if we allow information consisting of the values of n arbitrary, adaptively selected linear functionals at f.
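To make the error measure concrete: for two nonempty compact sets A, B ⊂ R, the Hausdorff distance is d_H(A, B) = max( sup_{a∈A} inf_{b∈B} |a − b|, sup_{b∈B} inf_{a∈A} |a − b| ). The following short sketch (illustrative only, not from the paper; the example function and approximation are hypothetical) computes d_H for finite zero sets:

```python
def hausdorff_distance(A, B):
    """Hausdorff distance between two nonempty finite sets of reals:
    max over the two one-sided distances sup_a inf_b |a - b| and
    sup_b inf_a |a - b|."""
    d_ab = max(min(abs(a - b) for b in B) for a in A)
    d_ba = max(min(abs(a - b) for a in A) for b in B)
    return max(d_ab, d_ba)

# Hypothetical example: f(x) = sin(2*pi*x) on [0, 1] has zero set
# Z(f) = {0, 0.5, 1}. Suppose an approximation Z_n(f) found only two
# zeros; the Hausdorff distance is then dominated by the missed zero at 1.
true_zeros = [0.0, 0.5, 1.0]
approx_zeros = [0.01, 0.48]
print(hausdorff_distance(true_zeros, approx_zeros))
```

Note that missing even one zero of Z(f) keeps d_H bounded away from zero, which is why approximating the full solution set is a genuinely harder (multivalued) problem than locating a single root.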