Abstract

The presentation illustrates the theory and application of Stochastic Automatic Differentiation (Stochastic Algorithmic Differentiation). Stochastic Automatic Differentiation allows one to improve the performance and reduce the memory requirements of an AD algorithm by exploiting the stochastic nature of the random variables involved. Considering an Expected Stochastic Automatic Differentiation, we derive a modified AD algorithm that allows efficient adjoint (backward) automatic differentiation of valuations containing non-pathwise operators, such as the conditional expectation operator. The algorithm can also be applied to calculate forward sensitivities, i.e. derivatives with respect to stochastic intermediate (future) values. This enables the efficient calculation of MVAs in finance. The implementation requires only a few additional lines of code. We provide source code and numerical examples.
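The core idea can be illustrated with a minimal sketch of adjoint (backward) AD over path-sampled random variables. The sketch below is illustrative only, not the implementation the abstract refers to: it uses an unconditional expectation in place of a conditional expectation, and all class and function names are invented for this example. The key point matches the abstract: the expectation operator is not pathwise, but it still admits a simple adjoint rule (the incoming adjoint is averaged and spread uniformly over the paths), so backward differentiation through it needs only a few extra lines.

```python
import numpy as np

_tape = []  # records backward steps in evaluation order


class RandomVariable:
    """A node whose value is a vector of Monte-Carlo path samples."""

    def __init__(self, values):
        self.values = np.asarray(values, dtype=float)
        self.adjoint = np.zeros_like(self.values)

    def __add__(self, other):
        out = RandomVariable(self.values + other.values)

        def backward():  # d(x+y)/dx = d(x+y)/dy = 1, pathwise
            self.adjoint += out.adjoint
            other.adjoint += out.adjoint

        _tape.append(backward)
        return out

    def __mul__(self, other):
        out = RandomVariable(self.values * other.values)

        def backward():  # product rule, applied pathwise
            self.adjoint += out.adjoint * other.values
            other.adjoint += out.adjoint * self.values

        _tape.append(backward)
        return out


def expectation(x):
    """E[.] — a non-pathwise operator: every output path gets the mean.

    Since d(out_j)/d(x_i) = 1/n for all i, j, the adjoint rule spreads
    the mean of the incoming adjoint uniformly over all paths.
    """
    n = x.values.size
    out = RandomVariable(np.full(n, x.values.mean()))

    def backward():
        x.adjoint += np.full(n, out.adjoint.mean())

    _tape.append(backward)
    return out


def backward(result):
    """Backward sweep: seed dY/dY = 1 on every path, then replay the tape."""
    result.adjoint = np.ones_like(result.values)
    for step in reversed(_tape):
        step()


# Example: Y = E[X] * X. With S = sum_i Y_i, dS/dX_k = 2 * mean(X) analytically.
X = RandomVariable([1.0, 2.0, 3.0, 4.0])
Y = expectation(X) * X
backward(Y)
print(X.adjoint)  # [5. 5. 5. 5.], i.e. 2 * mean(X) = 2 * 2.5 on each path
```

The adjoint of the expectation operator costs only a mean and a broadcast, which is why, as the abstract notes, supporting such non-pathwise operators in a backward AD sweep requires only a few additional lines of code.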

