Abstract

In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have recently seen tremendous success. Top-down approaches that learn NN potentials directly from experimental data have received less attention, typically facing numerical and computational challenges when backpropagating through MD simulations. We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables. Leveraging thermodynamic perturbation theory, we avoid exploding gradients and achieve around two orders of magnitude speed-up in gradient computation for top-down learning. We show the effectiveness of DiffTRe in learning NN potentials for an atomistic model of diamond and a coarse-grained model of water based on diverse experimental observables, including thermodynamic, structural, and mechanical properties. Importantly, DiffTRe also generalizes bottom-up structural coarse-graining methods such as iterative Boltzmann inversion to arbitrary potentials. The presented method constitutes an important milestone towards enriching NN potentials with experimental data, particularly when accurate bottom-up data is unavailable.

Highlights

  • In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have seen tremendous success recently

  • Within the data set distribution, state-of-the-art NN potentials have already reached the accuracy limit imposed by density functional theory (DFT), with the test error in predicting potential energy being around two orders of magnitude smaller than the corresponding expected DFT accuracy[11,18]

  • Addressing the call for NN potentials trained on experimental data[1], we propose the Differentiable Trajectory Reweighting (DiffTRe) method

Introduction

Experimental observables are linked only indirectly to the potential model via an expensive molecular mechanics simulation, complicating optimization. In the limit of a sufficiently large data set without a distribution shift[19,20] with respect to the application domain (potentially generated via active learning approaches[21]), remaining deviations of predicted observables from experiments are attributable to uncertainty in DFT simulations[11], in line with literature reporting that DFT is sensitive to the employed functionals[22]. More precise computational quantum mechanics models, e.g., the coupled cluster CCSD(T) method, improve on DFT accuracy at the expense of significantly increased computational effort for data set generation[23,24]. The main obstacle in bottom-up learning of NN potentials is thus the currently limited availability of highly precise and sufficiently broad data sets.
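The reweighting idea behind DiffTRe can be illustrated with a minimal sketch. Thermodynamic perturbation theory re-estimates an observable under a perturbed potential from states sampled under a reference potential, so gradients flow only through the potential energy function rather than through the MD integrator. The code below is an illustrative toy (a 1D harmonic potential with a handful of fixed "trajectory" states); the function and variable names are assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def reweighted_average(u_theta, theta, states, u_ref_vals, observable_vals, kT=1.0):
    """Estimate <O> under potential u_theta from reference states:
    <O>_theta ~= sum_i w_i O(s_i), with w_i proportional to
    exp(-(U_theta(s_i) - U_ref(s_i)) / kT)  (thermodynamic perturbation theory)."""
    u_new = np.array([u_theta(theta, s) for s in states])
    log_w = -(u_new - u_ref_vals) / kT
    log_w -= log_w.max()          # stabilize exponentials (log-sum-exp trick)
    w = np.exp(log_w)
    w /= w.sum()                  # normalized reweighting factors
    return float(np.dot(w, observable_vals))

# Toy system: U(theta, x) = 0.5 * theta * x^2, reference trajectory
# taken at theta_ref = 1.0 (fixed states here for brevity).
u = lambda theta, x: 0.5 * theta * x**2
states = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])
u_ref = np.array([u(1.0, x) for x in states])
obs = states**2                   # observable O(x) = x^2

# Re-estimate <x^2> for a stiffer potential without re-running the simulation.
mean_sq = reweighted_average(u, 1.2, states, u_ref, obs)
```

Because `reweighted_average` is an ordinary differentiable function of `theta`, an autodiff framework can compute its gradient directly, which is what lets the method sidestep backpropagation through the simulation for time-independent observables.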
