Abstract
Random feature neural network approximations of the potential in Hamiltonian systems yield approximations of molecular dynamics correlation observables that have the expected error OK-1+J-1212, for networks with K nodes using J data points, provided the Hessians of the potential and the observables are bounded. The loss function is based on the least squares error of the potential and regularizations, with the data points sampled from the Gibbs density. The proof uses a new derivation of the generalization error for random feature networks that does not apply the Rademacher or related complexities.
Published Version
Talk to us
Join us for a 30 min session where you can share your feedback and ask us any queries you have