Abstract
Atom-centered neural network (ANN) potentials offer high accuracy and computational efficiency in modeling atomic systems. A crucial step in developing reliable ANN potentials is the proper selection of atom-centered symmetry functions (ACSFs), also known as atomic features, to describe atomic environments; an inappropriate choice of ACSFs can yield poor-quality potentials. Here, we propose a gradient boosting decision tree (GBDT)-based framework for the automatic selection of optimal ACSFs. The framework takes uniformly distributed sets of ACSFs as input and evaluates their relative importance; the ACSFs with the highest average importance scores are then selected and used to train an ANN potential. Applied to the Ge system, the method yields an ANN potential with root-mean-square errors (RMSEs) of 10.2 meV/atom for energies and 84.8 meV/Å for forces while using only 18 ACSFs, striking a balance between accuracy and computational efficiency. We validate the framework against a grid-search method, which confirms that the ACSFs selected by our framework lie in the optimal region, and we compare it with commonly used feature-selection algorithms, which it outperforms in both effectiveness and accuracy. This study highlights the strong effect of ACSF parameters on ANN performance and presents a promising method for automatic ACSF selection, facilitating the development of machine learning potentials.
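To make the selection step concrete, below is a minimal sketch of GBDT-based importance ranking in the spirit of the abstract, assuming ACSF values have already been computed for each atomic environment. It uses scikit-learn's GradientBoostingRegressor rather than the authors' own implementation, and the names acsf_matrix, atomic_energies, and n_selected, as well as the synthetic data and hyperparameters, are hypothetical illustrations, not the paper's actual pipeline.

# Sketch: rank candidate ACSFs by averaged GBDT feature importance and keep
# the top k. Data, names, and hyperparameters here are placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Placeholder data: 1000 atomic environments described by 60 candidate ACSFs,
# regressed against per-atom energies (synthetic here for illustration).
acsf_matrix = rng.normal(size=(1000, 60))         # shape: (n_envs, n_acsfs)
atomic_energies = acsf_matrix[:, :5].sum(axis=1)  # toy target

# Fit GBDTs with several random seeds and average the importance scores,
# mirroring the "average importance" criterion described in the abstract.
importances = np.zeros(acsf_matrix.shape[1])
n_repeats = 5
for seed in range(n_repeats):
    gbdt = GradientBoostingRegressor(n_estimators=200, max_depth=3,
                                     random_state=seed)
    gbdt.fit(acsf_matrix, atomic_energies)
    importances += gbdt.feature_importances_
importances /= n_repeats

# Keep the top-k ACSFs by average importance (the paper reports k = 18 for Ge).
n_selected = 18
selected = np.argsort(importances)[::-1][:n_selected]
print("Selected ACSF indices:", np.sort(selected))

The selected subset would then define the descriptor vector used to train the ANN potential; the per-atom energy target used here is one plausible regression setup, chosen only for illustration.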