Abstract

In this paper we prove that rectified deep neural networks do not suffer from the curse of dimensionality when approximating McKean–Vlasov SDEs, in the sense that the number of parameters of the approximating networks grows at most polynomially in both the space dimension d of the SDE and the reciprocal 1/ϵ of the prescribed approximation accuracy ϵ.
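For orientation, a brief sketch of the standard setting (not a restatement of the paper's precise hypotheses): a McKean–Vlasov SDE is a stochastic differential equation whose coefficients depend on the law of the solution itself, typically written as

\[
  dX_t = \mu\bigl(t, X_t, \operatorname{Law}(X_t)\bigr)\,dt
       + \sigma\bigl(t, X_t, \operatorname{Law}(X_t)\bigr)\,dW_t,
  \qquad X_0 = \xi,
\]

where X_t takes values in \(\mathbb{R}^d\), W is a Brownian motion, and \(\operatorname{Law}(X_t)\) denotes the distribution of X_t. In this reading, the claimed absence of the curse of dimensionality means that the number of network parameters can be bounded by \(C\, d^{p}\, \epsilon^{-q}\) for constants C, p, q > 0 that do not depend on d or ϵ.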
