Abstract

We analyze approximation rates by deep ReLU networks for a class of multivariate solutions of Kolmogorov equations which arise in option pricing. Key technical devices are deep ReLU architectures capable of efficiently approximating tensor products. Combining this with results on the approximation of well-behaved (i.e., sufficiently smooth) univariate functions yields insights into rates of deep ReLU approximation of multivariate functions with tensor structure. We apply this in particular to the model problem given by the price of a European maximum option on a basket of $d$ assets within the Black–Scholes model. We prove that the solution to the $d$-variate option pricing problem can be approximated up to an $\varepsilon$-error by a deep ReLU network with depth $\mathcal{O}\big(\ln(d)\ln(\varepsilon^{-1})+\ln(d)^2\big)$ and $\mathcal{O}\big(d^{2+\frac{1}{n}}\varepsilon^{-\frac{1}{n}}\big)$ nonzero weights, where $n\in\mathbb{N}$ is arbitrary (with the constant implied in $\mathcal{O}(\cdot)$ depending on $n$). The techniques developed in the constructive proof are of independent interest for the analysis of the expressive power of deep neural networks for solution manifolds of PDEs in high dimension.

Highlights

  • The development of new classification and regression algorithms based on deep neural networks—coined “deep learning”—revolutionized the area of artificial intelligence, machine learning and data analysis [15]

  • As a particular application of the tools developed in the present paper, we provide a mathematical analysis of the rates of expressive power of neural networks for a particular, high-dimensional PDE which arises in mathematical finance, namely the pricing of a so-called European maximum option

  • While we admit that the European maximum option pricing problem for uncorrelated assets constitutes a rather special problem, the proofs in this paper develop several novel deep neural network approximation results of independent interest that can be applied to more general settings where a low-rank structure is implicit in high-dimensional problems


Summary

Motivation

The development of new classification and regression algorithms based on deep neural networks—coined “deep learning”—revolutionized the area of artificial intelligence, machine learning and data analysis [15]. These methods have also been applied to the numerical solution of partial differential equations (PDEs for short) [3,12,21,22,27,32,39,41,42]. Deep neural networks have been shown to provide optimal approximation rates for classical smoothness spaces such as Sobolev or Besov spaces. These results have been extended to Shearlet and Ridgelet spaces [5], modulation spaces [33], piecewise smooth functions [34], and polynomial chaos expansions [38]. All of these results indicate that classical approximation methods based on sparse expansions can be emulated by neural networks.
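The emulation results above rest on explicit ReLU constructions. A minimal NumPy sketch (names and grid resolution are our own choices, not taken from the paper) of Yarotsky's well-known ReLU approximation of the square function, the standard building block from which approximate products—and hence tensor products—are assembled via the polarization identity xy = ((x+y)² − x² − y²)/2:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sawtooth(x):
    # Hat function g: [0,1] -> [0,1], realized with three ReLU units.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def approx_square(x, m):
    # Yarotsky's construction: x**2 = x - sum_{k>=1} g^{(k)}(x) / 4**k,
    # where g^{(k)} is the k-fold composition of the sawtooth.
    # Truncating at depth m gives the piecewise-linear interpolant of x**2
    # on a dyadic grid of width 2**(-m), with uniform error <= 2**(-2m-2).
    g = np.asarray(x, dtype=float)
    out = np.asarray(x, dtype=float)
    for k in range(1, m + 1):
        g = sawtooth(g)
        out = out - g / 4.0 ** k
    return out

# Depth grows only logarithmically in the target accuracy: m layers
# of the sawtooth give error about 4**(-m-1) on [0, 1].
x = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(approx_square(x, 6) - x ** 2))
```

Each extra composition of the sawtooth quarters the error, which is the mechanism behind depth scaling like ln(ε⁻¹) in approximation-rate bounds of this kind.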

Contributions and Main Result
Outline
High-Dimensional Derivative Pricing
Regularity of the Cumulative Normal Distribution
Quadrature
Basic ReLU DNN Calculus
Basic Expression Rate Results
DNN Expression Rates for High-Dimensional Basket Prices
Discussion
Findings
A Additional Proofs
