Abstract

Molecular-orbital-based machine learning (MOB-ML) provides a general framework for the prediction of accurate correlation energies at the cost of obtaining molecular orbitals. The application of Nesbet's theorem makes it possible to recast a typical extrapolation task, training on correlation energies for small molecules and predicting correlation energies for large molecules, into an interpolation task based on the properties of orbital pairs. We demonstrate the importance of preserving physical constraints, including invariance conditions and size consistency, when generating the input for the machine learning model. Numerical improvements are demonstrated for different datasets covering total and relative energies for thermally accessible organic and transition-metal-containing molecules, non-covalent interactions, and transition-state energies. MOB-ML requires training data from only 1% of the QM7b-T dataset (i.e., only 70 organic molecules with seven or fewer heavy atoms) to predict the total energy of the remaining 99% of this dataset with sub-kcal/mol accuracy. This MOB-ML model is significantly more accurate than other methods when transferred to a dataset comprising molecules with 13 heavy atoms, exhibiting no loss of accuracy on a size-intensive (i.e., per-electron) basis. It is shown that MOB-ML also works well for extrapolating to transition-state structures, predicting the barrier region for malonaldehyde intramolecular proton transfer to within 0.35 kcal/mol when trained only on reactant/product-like structures. Finally, the use of the Gaussian process variance enables an active learning strategy for extending the MOB-ML model to new regions of chemical space with minimal effort. We demonstrate this active learning strategy by extending a QM7b-T model to describe non-covalent interactions in the protein backbone-backbone interaction dataset to an accuracy of 0.28 kcal/mol.
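The recasting via Nesbet's theorem mentioned above can be written schematically as follows; the pair-energy decomposition is standard, while the feature-vector notation is ours and is only meant to illustrate the idea:

```latex
% Nesbet's theorem: the total correlation energy decomposes into a sum
% of pair correlation energies over occupied orbitals i, j.
E_\mathrm{c} = \sum_{i \le j} \varepsilon_{ij}
% MOB-ML learns a single map from orbital-pair features f_{ij}
% (notation ours, e.g. Fock and exchange matrix elements in the
% molecular-orbital basis) to the pair energies,
\varepsilon_{ij} \approx \varepsilon^{\mathrm{ML}}\!\left(\mathbf{f}_{ij}\right)
% so that extrapolation in molecule size becomes interpolation in the
% space of orbital-pair properties.
```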

Highlights

  • The calculation of accurate potential energies of molecules and materials at affordable cost is at the heart of computational chemistry

  • We probe these effects on relative- and total-energy predictions for organic and transition-metal-containing molecules, and we investigate the applicability of molecular-orbital-based machine learning (MOB-ML) to transition-state structures and non-covalent interactions

  • This transferability test was repeated with 10,000 different training data sets to assess the training-set dependence of the MOB-ML models

Introduction

The calculation of accurate potential energies of molecules and materials at affordable cost is at the heart of computational chemistry. While state-of-the-art ab initio electronic structure theories can yield highly accurate results, they are computationally too expensive for routine applications. Density functional theory (DFT) is computationally cheaper and has enjoyed widespread applicability. Molecular-orbital-based machine learning (MOB-ML) offers an alternative route, predicting accurate correlation energies at the cost of obtaining molecular orbitals. We show how changes to the feature design affect the performance and transferability of MOB-ML models within the same molecular family (Section IV A) and across molecular families (Sections IV B-IV C). We probe these effects on relative- and total-energy predictions for organic and transition-metal-containing molecules, and we investigate the applicability of MOB-ML to transition-state structures and non-covalent interactions.
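The abstract notes that the Gaussian process variance enables an active learning strategy for extending a MOB-ML model to new regions of chemical space. The following is an illustrative sketch of that general idea, not the authors' code: a Gaussian process is trained on a small labeled set, and at each step the candidate with the largest predictive standard deviation is labeled next. The feature vectors and target function here are synthetic stand-ins for MOB-ML pair features and pair correlation energies.

```python
# Variance-based active learning with a Gaussian process (sketch).
# Synthetic 1D stand-in for a pair-energy surface; not MOB-ML data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def target(x):
    # Smooth synthetic surface standing in for pair correlation energies.
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 0])

pool = rng.uniform(0.0, 2.0, size=(200, 1))          # unlabeled candidate pool
train_idx = list(rng.choice(200, 5, replace=False))  # small initial training set

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                              normalize_y=True)

for _ in range(20):                      # active-learning loop
    X = pool[train_idx]
    gp.fit(X, target(X))
    _, std = gp.predict(pool, return_std=True)
    std[train_idx] = -np.inf             # exclude already-labeled points
    train_idx.append(int(np.argmax(std)))  # label the most uncertain candidate

mean, std = gp.predict(pool, return_std=True)
print(f"labeled points: {len(train_idx)}, max predictive std: {std.max():.4f}")
```

The key design point, mirroring the strategy described in the abstract, is that the model's own uncertainty estimate selects which new calculations to run, so regions of chemical space already covered by the training data are not relabeled.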
