Abstract

This article revisits the Diffusion Operator Integral (DOI) variance reduction technique originally proposed in Heath and Platen (2002) and extends its theoretical concept to the pricing of American-style options under (time-homogeneous) Lévy stochastic differential equations. The resulting Jump Diffusion Operator Integral (JDOI) method can be combined with numerous Monte Carlo-based stopping-time algorithms, including the ubiquitous least-squares Monte Carlo (LSMC) algorithm of Longstaff and Schwartz (cf. Carriere (1996) and Longstaff and Schwartz (2001)). We exemplify the usefulness of our theoretical derivations under a concrete, yet very general, jump-diffusion stochastic volatility model and test the resulting LSMC-based version of the JDOI method. The results provide evidence of a strong variance reduction when compared with a plain application of the LSMC algorithm and prove that applying our technique on top of Monte Carlo-based pricing schemes offers a powerful way to speed up these methods.
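As background, the LSMC algorithm referenced above estimates the continuation value at each exercise date by regressing realized discounted payoffs on basis functions of the state, then compares it with the immediate exercise value. The sketch below is a minimal, hedged illustration for an American put under plain geometric Brownian motion; it is not the JDOI method or the article's jump-diffusion stochastic volatility dynamics, and all parameter values (`S0`, `K`, `r`, `sigma`) are hypothetical.

```python
import numpy as np

def lsmc_american_put(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                      n_steps=50, n_paths=20000, seed=0):
    """Least-squares Monte Carlo (Longstaff-Schwartz) price of an
    American put. Illustrative sketch under GBM dynamics only."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    disc = np.exp(-r * dt)

    # Simulate GBM paths on the exercise grid t_1, ..., t_n
    z = rng.standard_normal((n_paths, n_steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))

    # Terminal cashflow, then backward induction over exercise dates
    cashflow = np.maximum(K - S[:, -1], 0.0)
    for t in range(n_steps - 2, -1, -1):
        cashflow *= disc  # discount one step back
        itm = (K - S[:, t]) > 0.0  # regress only on in-the-money paths
        if itm.sum() < 3:
            continue
        x = S[itm, t]
        # Quadratic polynomial basis for the continuation value
        coeffs = np.polyfit(x, cashflow[itm], 2)
        continuation = np.polyval(coeffs, x)
        exercise = K - x
        ex_now = exercise > continuation
        idx = np.where(itm)[0][ex_now]
        cashflow[idx] = exercise[ex_now]  # exercise: replace cashflow
    return disc * cashflow.mean()
```

A variance reduction technique such as JDOI would be layered on top of a scheme like this, replacing the raw payoff with a lower-variance estimator while leaving the stopping-time logic intact.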
