Abstract

Layered, two-dimensional (2D) materials are promising for next-generation photonic devices. The thickness of mechanically cleaved flakes and chemical-vapor-deposited thin films is typically distributed randomly over a large area, which makes accurate identification of atomic layer numbers time-consuming. Hyperspectral imaging microscopy yields spectral information that can be used to distinguish specimens of different thicknesses by their spectral differences. However, its spatial resolution is relatively low due to the nature of spectral imaging. In this work, we present a 3D deep learning solution called DALM (deep-learning-enabled atomic layer mapping) that merges hyperspectral reflection images (high spectral resolution) and RGB images (high spatial resolution) for the identification and segmentation of MoS2 flakes with mono-, bi-, tri-, and multilayer thicknesses. DALM is trained on a small set of labeled images, automatically predicts layer distributions and segments individual layers with high accuracy, and shows robustness to illumination and contrast variations. Further, we show its advantageous performance over the state-of-the-art model that is based solely on RGB microscope images. This AI-supported technique, with its high speed, spatial resolution, and accuracy, allows for reliable computer-aided identification of atomically thin materials.
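
To illustrate the kind of fusion the abstract describes, the following is a minimal, hypothetical PyTorch sketch (not the authors' DALM implementation): a 3D-convolution branch extracts spectral-spatial features from the low-resolution hyperspectral cube, a 2D-convolution branch extracts spatial features from the high-resolution RGB image, and the fused features are decoded into per-pixel layer-number classes (background, mono-, bi-, tri-, multilayer). All network names, channel widths, and input sizes are illustrative assumptions.

```python
# Hypothetical dual-branch hyperspectral + RGB fusion network (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F


class HyperspectralRGBFusionNet(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        # 3D branch: treats the spectral axis as the "depth" dimension of Conv3d.
        self.spec_branch = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(inplace=True),
            nn.Conv3d(8, 16, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(inplace=True),
        )
        # 2D branch: standard convolutions over the RGB image.
        self.rgb_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Fusion head: concatenated features -> per-pixel class logits.
        self.head = nn.Sequential(
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, num_classes, kernel_size=1),
        )

    def forward(self, hsi: torch.Tensor, rgb: torch.Tensor) -> torch.Tensor:
        # hsi: (B, 1, bands, h, w)  -- low spatial, high spectral resolution
        # rgb: (B, 3, H, W)         -- high spatial resolution
        f3d = self.spec_branch(hsi)            # (B, 16, bands, h, w)
        f3d = f3d.mean(dim=2)                  # collapse spectral axis -> (B, 16, h, w)
        f3d = F.interpolate(f3d, size=rgb.shape[-2:],
                            mode="bilinear", align_corners=False)
        f2d = self.rgb_branch(rgb)             # (B, 16, H, W)
        fused = torch.cat([f3d, f2d], dim=1)   # (B, 32, H, W)
        return self.head(fused)                # (B, num_classes, H, W)


if __name__ == "__main__":
    model = HyperspectralRGBFusionNet(num_classes=5)
    hsi = torch.randn(1, 1, 60, 64, 64)        # e.g. 60 spectral bands at 64x64 pixels
    rgb = torch.randn(1, 3, 256, 256)          # higher-resolution RGB image
    logits = model(hsi, rgb)
    print(logits.shape)                        # torch.Size([1, 5, 256, 256])
```

In this sketch the spectral branch is collapsed and upsampled to the RGB grid before fusion, so the predicted layer map inherits the spatial resolution of the RGB image while still drawing on the spectral information; per-pixel training against labeled layer maps would use a standard cross-entropy loss.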
