Abstract
In recent years the development of machine learning potentials (MLPs) has become a very active field of research. Numerous approaches have been proposed that allow one to perform extended simulations of large systems at a small fraction of the computational cost of electronic structure calculations. The key to the success of modern MLPs is their close to first-principles quality description of the atomic interactions. This accuracy is reached by using very flexible functional forms in combination with high-level reference data from electronic structure calculations. These data sets can include up to hundreds of thousands of structures covering millions of atomic environments to ensure that all relevant features of the potential energy surface are well represented. The handling of such large data sets is nowadays becoming one of the main challenges in the construction of MLPs. In this paper we present a method, the bin-and-hash (BAH) algorithm, to overcome this problem by enabling the efficient identification and comparison of large numbers of multidimensional vectors. Such vectors emerge in multiple contexts in the construction of MLPs. Examples are the comparison of local atomic environments to identify and avoid redundant information in the reference data sets, which is costly in terms of both the electronic structure calculations and the training process, the assessment of the quality of the descriptors used as structural fingerprints in many types of MLPs, and the detection of possibly unreliable data points. The BAH algorithm is illustrated for the example of high-dimensional neural network potentials using atom-centered symmetry functions for the geometrical description of the atomic environments, but the method is general and can be combined with any current type of MLP.
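To make the approach concrete, the following is a minimal Python sketch of how such a bin-and-hash lookup could be organized: each descriptor vector is discretized into integer bins, the binned tuple serves as a hash key, and only vectors that land in the same bucket are candidates for an explicit comparison. The bin width `delta`, the helper names, and the use of Python's built-in dictionary hashing are illustrative assumptions, not the exact scheme of the published algorithm.

```python
# Minimal sketch of a bin-and-hash style lookup for descriptor vectors.
# The bin width `delta` and the use of a Python dict as the hash table are
# illustrative assumptions, not the exact scheme of the published method.
import numpy as np
from collections import defaultdict

def bin_and_hash(vectors, delta=1e-3):
    """Group multidimensional descriptor vectors into buckets keyed by their
    binned (discretized) components; only vectors sharing a bucket still need
    an explicit element-wise comparison."""
    buckets = defaultdict(list)
    for i, v in enumerate(vectors):
        key = tuple(np.floor(np.asarray(v) / delta).astype(np.int64))
        buckets[key].append(i)
    return buckets

def near_duplicates(vectors, delta=1e-3):
    """Return index pairs whose binned keys coincide (candidate duplicates)."""
    pairs = []
    for indices in bin_and_hash(vectors, delta).values():
        for a in range(len(indices)):
            for b in range(a + 1, len(indices)):
                pairs.append((indices[a], indices[b]))
    return pairs

# Example: three symmetry-function vectors, two of them identical.
vecs = np.array([[0.10, 2.30, 0.71],
                 [0.10, 2.30, 0.71],
                 [0.90, 1.10, 0.05]])
print(near_duplicates(vecs, delta=1e-3))   # [(0, 1)]
```

In this picture the expensive all-pairs comparison is replaced by a single pass over the data plus a small number of comparisons inside each bucket.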
Highlights
The required information is obtained by sampling the potential energy surface (PES) at discrete points, i.e., particular atomic configurations, using computationally demanding electronic structure methods such as density functional theory (DFT) [4,5].
Notice that this speedup grows with the data set size, since the naive approach scales as the square of the data set size whereas the bin-and-hash (BAH) algorithm scales linearly (see the scaling sketch after these highlights).
In this work we have presented the bin-and-hash method, which allows a computationally very efficient comparison of the large numbers of geometric atomic environments used in the construction of modern machine learning potentials.
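To put rough numbers on the scaling statement in the highlights above, the sketch below counts the explicit vector comparisons needed with and without a binning pass; the bin width, the dimensionality, and the uniform random test data are placeholders rather than realistic symmetry-function values.

```python
# Rough, illustrative count of explicit comparisons: a naive all-pairs check
# needs N(N-1)/2 comparisons, whereas after a single O(N) binning pass only
# vectors that share a bucket are compared. For data without many
# near-identical vectors almost no within-bucket comparisons remain, so the
# total cost is dominated by the linear binning step.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
delta = 0.05                                   # assumed bin width

for n in (1_000, 10_000, 100_000):
    vectors = rng.random((n, 8))               # placeholder descriptor vectors
    naive = n * (n - 1) // 2                   # comparisons without binning

    buckets = defaultdict(int)
    for v in vectors:
        buckets[tuple(np.floor(v / delta).astype(np.int64))] += 1
    within = sum(c * (c - 1) // 2 for c in buckets.values())

    print(f"N = {n:>6}   all-pairs: {naive:,}   within-bucket: {within:,}")
```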
Summary
Machine learning (ML) has become an important tool for the development of atomistic potentials, with a wide variety of applications in chemistry, physics, and materials science [1–3]. Section III D shows how the method can be utilized to find similar atomic environments and contradicting information in a data set. Overall, these applications are examples of the well-known and complex problem of efficiently finding distances and nearest neighbors among points in multi-dimensional data. Previous approaches include binary-tree data structures such as k-d trees [62,63], which can efficiently store data points according to their mutual distance in multi-dimensional space and rapidly reduce the search space thanks to their binary structure, and dimensionality-reduction algorithms such as principal component analysis (PCA) [64,65] and SketchMap [36], which instead reduce the size of the space under consideration.
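As a point of reference for the tree-based alternatives mentioned above, the following is a small sketch of a nearest-neighbor query over descriptor vectors using SciPy's cKDTree; the random data and the dimensionality of 16 are placeholders for real symmetry-function vectors.

```python
# Minimal sketch of the k-d-tree route mentioned above: build a tree over
# descriptor vectors and query the nearest neighbor of every point.
# The random data and dimensionality are placeholders for real
# symmetry-function vectors; SciPy's cKDTree is one readily available tool.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)
descriptors = rng.random((10_000, 16))   # 10,000 environments, 16 descriptors

tree = cKDTree(descriptors)
# k=2 because the closest hit for each query point is the point itself.
dist, idx = tree.query(descriptors, k=2)
nearest_dist = dist[:, 1]
print("smallest nearest-neighbor distance:", nearest_dist.min())
```

Such tree queries are exact, but their cost tends to approach that of a brute-force search as the dimensionality of the descriptors grows, which is one reason alternative strategies remain of interest.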