Abstract

The understanding of adaptive algorithms for stochastic differential equations (SDEs) is an open area, where many issues related to both convergence and stability (long-time behaviour) of algorithms are unresolved. This paper considers a very simple adaptive algorithm, based on controlling only the drift component of a time step. Both convergence and stability are studied. The primary issue in the convergence analysis is that the adaptive method does not necessarily drive the time steps to zero as the user-input tolerance is reduced. This possibility must be quantified and shown to have low probability. The primary issue in the stability analysis is ergodicity. It is assumed that the noise is nondegenerate, so that the diffusion process is elliptic, and that the drift satisfies a coercivity condition. The SDE is then geometrically ergodic (averages converge to statistical equilibrium exponentially quickly). If the drift is not linearly bounded, then explicit fixed-time-step approximations, such as the Euler–Maruyama scheme, may fail to be ergodic. In this work, it is shown that the simple adaptive time-stepping strategy cures this problem. In addition to proving ergodicity, an exponential moment bound is also proved, generalizing a result known to hold for the SDE itself.
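
For concreteness, the following is a minimal sketch of what a drift-controlled adaptive Euler–Maruyama scheme of this kind might look like. The specific step-size rule (shrinking the step so that the drift increment |f(x)|·Δt stays below a tolerance), the dt_min/dt_max guards, and the cubic-drift example are illustrative assumptions for this sketch, not the exact scheme or constants analysed in the paper.

```python
import numpy as np

def adaptive_em(f, sigma, x0, T, tol, dt_max=1e-1, dt_min=1e-8, rng=None):
    """Adaptive Euler-Maruyama for dX = f(X) dt + sigma dW.

    Illustrative rule: the step size is chosen so that the drift
    increment |f(x)| * dt stays below the tolerance `tol`
    (an assumed control rule, not necessarily the paper's).
    """
    rng = np.random.default_rng() if rng is None else rng
    t, x = 0.0, np.asarray(x0, dtype=float)
    ts, xs = [t], [x.copy()]
    while t < T:
        drift = f(x)
        # Drift-controlled step: shrink dt when the drift is large.
        dt = min(dt_max, tol / max(np.linalg.norm(drift), 1e-12))
        dt = max(dt, dt_min)          # guard against vanishing steps
        dt = min(dt, T - t)           # do not overshoot the final time
        dW = np.sqrt(dt) * rng.standard_normal(x.shape)
        x = x + drift * dt + sigma * dW
        t += dt
        ts.append(t)
        xs.append(x.copy())
    return np.array(ts), np.array(xs)

# Example: cubic drift f(x) = x - x**3, a case where fixed-step explicit
# Euler-Maruyama can lose ergodicity but an adaptive step does not.
if __name__ == "__main__":
    ts, xs = adaptive_em(lambda x: x - x**3, sigma=1.0,
                         x0=np.array([2.0]), T=100.0, tol=0.1)
    print("trajectory average of X:", xs.mean())
```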
