Abstract
Scanning tunneling microscopy (STM) is a powerful technique that provides the ability to manipulate and characterize individual atoms and molecules with atomic-level precision. However, the processes of scanning samples, operating the probe, and analyzing data are typically labor-intensive and subjective. Deep learning (DL) techniques have shown immense potential in automating complex tasks and solving high-dimensional problems. In this study, we developed a DL-powered framework that enables autonomous operation of the STM without human intervention. Our framework employs a convolutional neural network (CNN) for real-time evaluation of STM image quality, a U-net model for identifying bare surfaces, and a deep Q-learning network (DQN) agent for autonomous probe conditioning. Additionally, we integrated an object recognition model for the automated recognition of different adsorbates. This autonomous framework enables the acquisition of space-averaged information with STM without compromising high-resolution molecular imaging. We measured an area of approximately 1.9 μm² within 48 h of continuous operation and automatically generated statistics on the molecular species present within this mesoscopic area. We demonstrate the high robustness of the framework by conducting measurements at liquid nitrogen temperature (∼78 K). We envision that the integration of DL techniques and high-resolution microscopy will not only extend the functionality and capability of scanning probe microscopes but also accelerate the understanding and discovery of new materials.
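The closed-loop logic described above (evaluate each image with a quality classifier, and recondition the probe with a learned agent whenever quality drops) can be sketched as follows. This is a minimal illustrative mock-up, not the authors' implementation: the variance-based quality check, the fixed action list, and the threshold are all placeholder assumptions standing in for the CNN classifier and the DQN agent.

```python
def classify_image_quality(image):
    # Placeholder for the CNN quality classifier: here, a simple
    # pixel-variance check stands in for the learned model.
    mean = sum(image) / len(image)
    var = sum((x - mean) ** 2 for x in image) / len(image)
    return var > 0.01  # True = "sharp tip"; threshold is arbitrary


def condition_probe(step):
    # Placeholder for the DQN agent: cycle through a fixed set of
    # hypothetical tip-conditioning actions instead of a learned policy.
    actions = ["voltage_pulse", "soft_poke", "move_to_clean_area"]
    return actions[step % len(actions)]


def autonomous_scan(frames):
    """Scan frames, reconditioning the probe whenever quality drops."""
    results, n_actions = [], 0
    for image in frames:
        while not classify_image_quality(image):
            condition_probe(n_actions)  # apply a conditioning action
            n_actions += 1
            # Simulate the tip improving after each conditioning step.
            image = [x + 0.2 * (i % 2) for i, x in enumerate(image)]
        results.append(image)
    return results, n_actions
```

In the actual framework, the accepted frames would additionally pass through the U-net surface-segmentation and adsorbate-recognition models before the statistics are accumulated.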