Abstract

Machine learning techniques are attractive options for developing highly accurate analysis tools for nanomaterials characterization, including high-resolution transmission electron microscopy (HRTEM). However, successfully implementing such machine learning tools can be difficult because of the challenge of procuring sufficiently large, high-quality training datasets from experiments. In this work, we introduce Construction Zone, a Python package for the rapid generation of complex nanoscale atomic structures, which enables fast, systematic sampling of realistic nanomaterial structures and can be used as a random structure generator for large, diverse synthetic datasets. Using Construction Zone, we develop an end-to-end machine learning workflow for training neural network models to analyze experimental atomic-resolution HRTEM images on the task of nanoparticle image segmentation purely with simulated databases. Further, we study the data curation process to understand how various aspects of the curated simulated data (including simulation fidelity, the distribution of atomic structures, and the distribution of imaging conditions) affect model performance across three benchmark experimental HRTEM image datasets. Using our workflow, we achieve state-of-the-art segmentation performance on these experimental benchmarks, and we discuss robust strategies for consistently achieving high performance with machine learning in experimental settings using purely synthetic data. Construction Zone and its documentation are available at https://github.com/lerandc/construction_zone.
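As a rough illustration of the random structure sampling idea described above, the sketch below draws gold nanoparticles with varying motifs, sizes, and orientations to build a small synthetic structure set. It uses ASE rather than Construction Zone's own API (see the linked repository for that); the motif choices, size ranges, and file names here are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the Construction Zone API): randomly sampling
# nanoparticle structures with ASE to assemble a small synthetic structure set.
import random

from ase.cluster import Decahedron, Icosahedron, Octahedron
from ase.io import write


def sample_random_nanoparticle(seed=None):
    """Draw one Au nanoparticle with a randomly chosen motif, size, and in-plane orientation."""
    rng = random.Random(seed)
    motif = rng.choice(["icosahedron", "decahedron", "octahedron"])
    if motif == "icosahedron":
        atoms = Icosahedron("Au", noshells=rng.randint(3, 7))
    elif motif == "decahedron":
        atoms = Decahedron("Au", p=rng.randint(2, 5), q=rng.randint(2, 5), r=0)
    else:
        atoms = Octahedron("Au", length=rng.randint(4, 8))
    atoms.rotate(rng.uniform(0.0, 360.0), "z")  # random rotation about the beam axis
    return atoms


if __name__ == "__main__":
    # Write a handful of structures for downstream HRTEM image simulation.
    for i in range(10):
        particle = sample_random_nanoparticle(seed=i)
        write(f"particle_{i:03d}.xyz", particle)
```

In the workflow the abstract describes, structures sampled in this manner would then be passed through HRTEM image simulation under varied imaging conditions before being used as training data.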
