Abstract
In this paper, we present a method for the accurate estimation of the derivative (a.k.a. sensitivity) of expectations of functions involving an indicator function by combining stochastic algorithmic differentiation and regression. The method is an improvement of the approach presented in [Risk Magazine April 2018]. The finite difference approximation of a partial derivative of a Monte-Carlo integral of a discontinuous function is known to exhibit a high Monte-Carlo error. The issue is evident, since the Monte-Carlo approximation of a discontinuous function is just a finite sum of discontinuous functions and, as such, is not even differentiable. The algorithmic differentiation of a discontinuous function is problematic. A natural approach is to replace the discontinuity by a continuous function. This is equivalent to replacing the path-wise automatic differentiation by a (local) finite difference approximation. We present an improvement (in terms of variance reduction) by decoupling the integration of the Dirac delta from the remaining conditional expectation and estimating the two parts by separate regressions. For the algorithmic differentiation, we derive an operator that can be injected seamlessly, with minimal code changes, into the algorithm, yielding the exact result.
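To illustrate the decoupling (in our own notation, not that of the paper), consider a payoff of the form $\mathbf{1}_{S(\theta) > K}\, g(S(\theta))$. Differentiating its expectation with respect to the model parameter $\theta$ gives

$$\frac{\partial}{\partial \theta}\,\mathrm{E}\!\left[\mathbf{1}_{S(\theta)>K}\, g(S(\theta))\right] \;=\; \mathrm{E}\!\left[\mathbf{1}_{S>K}\, g'(S)\,\frac{\partial S}{\partial \theta}\right] \;+\; \phi_S(K)\;\mathrm{E}\!\left[g(S)\,\frac{\partial S}{\partial \theta}\;\middle|\;S=K\right],$$

where $\phi_S$ denotes the density of $S$. The second term is the integrated Dirac delta, split into a density at the discontinuity and a conditional expectation; these are the two parts that the abstract proposes to estimate by separate regressions.

The following is a minimal NumPy sketch of this idea under a toy model $S(\theta) = \theta + Z$ with $Z \sim \mathcal{N}(0,1)$ and $g \equiv 1$; the kernel width `eps`, the sample size, and all names are illustrative choices of ours and not taken from the paper or its reference implementation. It contrasts the naive path-wise derivative (which is zero), the localized-kernel operator corresponding to the local finite difference mentioned above, and the decoupled density-times-conditional-expectation estimator, which the paper refines by replacing the simple localized averages with regressions.

```python
import numpy as np

# Toy example (illustrative, not the paper's implementation):
# estimate d/dtheta E[ 1_{S(theta) > K} ] for S(theta) = theta + Z, Z ~ N(0,1).
# The exact value is the density of S at K, i.e. phi(K - theta).

rng = np.random.default_rng(0)
n, theta, K, eps = 200_000, 0.0, 0.5, 0.05

Z = rng.standard_normal(n)
S = theta + Z            # simulated values of S(theta)
dS_dtheta = np.ones(n)   # path-wise derivative of S w.r.t. theta (here trivially 1)

# Path-wise AD of the indicator gives 0 almost surely: the naive estimator is useless.
naive = np.mean(np.zeros(n) * dS_dtheta)

# Operator replacing d/dS 1_{S > K} by a localized kernel 1_{|S-K| < eps} / (2 eps),
# i.e. the "local finite difference" interpretation.
kernel = (np.abs(S - K) < eps) / (2 * eps)
smoothed = np.mean(kernel * dS_dtheta)

# Decoupled estimator: estimate the density of S at K (the integrated Dirac delta)
# and the conditional expectation E[ dS/dtheta | S = K ] separately, then multiply.
# In this toy case the two estimators coincide because dS/dtheta is constant;
# the paper instead estimates each part by its own regression to reduce variance.
near = np.abs(S - K) < eps
density_at_K = near.mean() / (2 * eps)
cond_expectation = dS_dtheta[near].mean() if near.any() else 0.0
decoupled = density_at_K * cond_expectation

exact = np.exp(-0.5 * (K - theta) ** 2) / np.sqrt(2 * np.pi)
print(f"naive path-wise: {naive:.4f}, kernel: {smoothed:.4f}, "
      f"decoupled: {decoupled:.4f}, exact: {exact:.4f}")
```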