Abstract Relying on the classical connection between backward stochastic differential equations and nonlinear parabolic partial differential equations (PDEs), we propose a new probabilistic learning scheme for solving high-dimensional semilinear parabolic PDEs. This scheme is inspired by the machine-learning approach, based on deep neural networks, developed in Han et al. (2018, Solving high-dimensional partial differential equations using deep learning. Proc. Natl. Acad. Sci., 115, 8505–8510). Our algorithm is based on a Picard iteration scheme in which a sequence of linear-quadratic optimization problems is solved by means of a stochastic gradient descent algorithm. In the framework of a linear specification of the approximation space, we prove a convergence result for our scheme under a smallness condition. In practice, in order to treat high-dimensional examples, we employ sparse-grid approximation spaces. In the case of periodic coefficients and using pre-wavelet basis functions, we obtain an upper bound on the global complexity of our method. It shows, in particular, that the curse of dimensionality is tamed in the sense that, in order to achieve a root mean squared error of order $\varepsilon $ for a prescribed precision $\varepsilon $, the complexity of the Picard algorithm grows polynomially in $\varepsilon ^{-1}$ up to some logarithmic factor $|\!\log (\varepsilon )|$, whose exponent grows linearly with respect to the PDE dimension. Various numerical results are presented to validate the performance of our method and to compare it with some recent machine learning schemes proposed in E et al. (2017, Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations. Commun. Math. Stat., 5, 349–380) and Huré et al. (2020, Deep backward schemes for high-dimensional nonlinear PDEs. Math. Comput., 89, 1547–1579).
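To illustrate the structure of such a scheme (not the paper's actual algorithm, basis, or PDE), the following is a minimal one-dimensional toy sketch: the BSDE $Y_t = g(X_T) + \int_t^T f(Y_s)\,ds - \int_t^T Z_s\,dW_s$ with $X$ a Brownian motion is discretized by an Euler scheme, and at each Picard stage the driver $f$ is frozen at the previous iterate, so that the residual $\mathbb{E}\,|Y_T^{\theta} - g(X_T)|^2$ becomes quadratic in the linear parameters $\theta$ and is minimized by plain stochastic gradient descent. The terminal condition `g`, driver `f`, basis (constant $Y_0$ and piecewise-constant-in-time $Z$), and all step sizes are illustrative assumptions.

```python
# Toy sketch of a Picard/SGD scheme for a 1-d BSDE (illustrative only):
# at each Picard stage the driver f is frozen along paths of the previous
# iterate, making the terminal residual quadratic in theta, which is then
# minimized by stochastic gradient descent on fresh Monte Carlo batches.
import numpy as np

rng = np.random.default_rng(0)
T, N, M = 1.0, 20, 512           # horizon, time steps, batch size
dt = T / N
g = lambda x: np.sin(x)          # terminal condition (illustrative)
f = lambda y: 0.1 * np.cos(y)    # nonlinear driver (illustrative)

# Toy linear approximation space: theta[0] = Y_0 (scalar) and theta[1:]
# a piecewise-constant-in-time, space-independent Z process.
theta = np.zeros(N + 1)

def frozen_y_path(theta_prev, dW):
    """Euler scheme for the previous Picard iterate Y^k along given paths;
    its values are used to freeze the driver f at the next stage."""
    Y = np.full(dW.shape[0], theta_prev[0])
    path = [Y.copy()]
    for k in range(dW.shape[1]):
        Y = Y - f(Y) * dt + theta_prev[1 + k] * dW[:, k]
        path.append(Y.copy())
    return path

for picard in range(5):                       # outer Picard iterations
    theta_prev = theta.copy()
    for step in range(200):                   # inner SGD on the quadratic loss
        dW = rng.normal(0.0, np.sqrt(dt), size=(M, N))
        XT = dW.sum(axis=1)                   # X_T for Brownian motion from 0
        prev = frozen_y_path(theta_prev, dW)  # previous iterate along paths
        drift = dt * sum(f(prev[k]) for k in range(N))
        # Y_T^theta is linear in theta once the driver is frozen:
        YT = theta[0] - drift + dW @ theta[1:]
        r = YT - g(XT)                        # terminal residual
        grad = np.empty(N + 1)                # exact gradient of mean(r**2)
        grad[0] = 2.0 * r.mean()
        grad[1:] = 2.0 * (r[:, None] * dW).mean(axis=0)
        theta -= 0.1 * grad

print("estimated u(0, 0) ~", theta[0])        # theta[0] approximates Y_0
```

Freezing the driver is what makes each stage a linear-quadratic problem: the SGD inner loop then only has to solve a least-squares regression, and the outer Picard loop restores the nonlinearity iteratively.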