Abstract

The modelling of forward initial margin poses a challenging problem, as it requires a nested Monte Carlo simulation, which is computationally intractable in practice. Abundant literature has been published on approximation methods aiming to reduce the dimensionality of the problem, the most popular being the family of regression methods. This article describes the mathematical foundations on which these regression approximation methods rest. Mathematical rigor is introduced to show that, in essence, all of these methods perform orthogonal projections on Hilbert spaces, differing only in the functional form chosen to numerically estimate the conditional expectation. The most popular methods in the literature so far are covered: polynomial approximations, kernel regressions, and neural networks.
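
To make the core idea concrete, the sketch below (not taken from the paper) illustrates the regression view of the problem in its simplest form: a conditional expectation E[V_T | X_t], which a nested simulation would estimate with an inner Monte Carlo per outer scenario, is instead obtained as an orthogonal projection of the simulated future values onto a finite polynomial basis of the risk factor. All model choices (geometric Brownian motion dynamics, a call payoff, the parameter values, and the cubic basis) are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch: regression (orthogonal projection) vs. nested Monte Carlo
# for a conditional expectation E[V_T | X_t]. Model and parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: risk factor X follows a GBM; the "future value"
# whose conditional expectation we want is a call payoff observed at horizon T.
n_paths, x0, mu, sigma, t, T, strike = 100_000, 100.0, 0.02, 0.2, 1.0, 2.0, 100.0

# Simulate the risk factor at the intermediate date t and at the horizon T.
z1 = rng.standard_normal(n_paths)
z2 = rng.standard_normal(n_paths)
x_t = x0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z1)
x_T = x_t * np.exp((mu - 0.5 * sigma**2) * (T - t) + sigma * np.sqrt(T - t) * z2)
v_T = np.maximum(x_T - strike, 0.0)  # future value observed at T on each path

# Orthogonal projection of v_T onto span{1, x, x^2, x^3} in L^2: one ordinary
# least-squares regression replaces the inner (nested) simulation entirely.
basis = np.vander(x_t, N=4, increasing=True)        # columns: 1, x, x^2, x^3
coef, *_ = np.linalg.lstsq(basis, v_T, rcond=None)  # projection coefficients
cond_exp_hat = basis @ coef                          # estimate of E[V_T | X_t] per path

# Brute-force nested Monte Carlo check at a few outer scenarios (the costly
# approach the regression is meant to avoid).
for x in (80.0, 100.0, 120.0):
    z = rng.standard_normal(20_000)
    inner = np.maximum(
        x * np.exp((mu - 0.5 * sigma**2) * (T - t) + sigma * np.sqrt(T - t) * z) - strike,
        0.0,
    )
    reg = np.polyval(coef[::-1], x)  # evaluate the fitted polynomial at x
    print(f"X_t={x:6.1f}  nested MC={inner.mean():8.3f}  regression={reg:8.3f}")
```

Swapping the polynomial basis for a kernel regression or a neural network changes only the functional form of the projection; the underlying estimation of the conditional expectation is the same.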
