Abstract
Traditional ground-penetrating radar (GPR) data inversion relies on iterative algorithms that suffer from high computational cost and low accuracy in complex subsurface scenarios. Existing deep learning-based methods focus on idealized homogeneous subsurface environments and ignore the interference caused by clutter and noise in real-world heterogeneous environments. To address these issues, a two-stage deep neural network (DNN), called DMRF-UNet, is proposed to reconstruct the permittivity distributions of subsurface objects from GPR B-scans under heterogeneous soil conditions. In the first stage, a U-shaped DNN with multi-receptive-field convolutions (MRF-UNet1) removes the clutter caused by the inhomogeneity of the heterogeneous soil. The denoised B-scan from MRF-UNet1 is then combined with the noisy B-scan and fed to the second-stage network (MRF-UNet2), which learns the inverse mapping and reconstructs the permittivity distribution of the subsurface objects. To avoid information loss, an end-to-end training method that combines the loss functions of the two stages is introduced. A wide range of heterogeneous subsurface scenarios and corresponding B-scans are generated to evaluate the inversion performance. Results from both numerical experiments and real measurements show that the proposed network reconstructs the permittivities, shapes, sizes, and locations of subsurface objects with high accuracy. Comparisons with existing methods demonstrate the superiority of the proposed approach for inversion under heterogeneous soil conditions.
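Since the abstract only describes the pipeline in prose, a minimal PyTorch sketch may help make the two-stage structure concrete. The block design (parallel 3x3/5x5/7x7 convolutions as the "multi-receptive-field" unit), the single-block stages, the L1 loss terms, and the weights w1/w2 are illustrative assumptions, not the paper's exact architecture or loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MRFBlock(nn.Module):
    """Parallel convolutions with different kernel sizes, fused by a 1x1 conv;
    an assumed reading of 'multi-receptive-field convolution'."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.b3 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.b5 = nn.Conv2d(in_ch, out_ch, 5, padding=2)
        self.b7 = nn.Conv2d(in_ch, out_ch, 7, padding=3)
        self.fuse = nn.Conv2d(3 * out_ch, out_ch, 1)

    def forward(self, x):
        y = torch.cat([self.b3(x), self.b5(x), self.b7(x)], dim=1)
        return F.relu(self.fuse(y))

class StageNet(nn.Module):
    """Stand-in for one MRF-UNet stage (the real network is a U-shaped
    encoder-decoder; a single MRF block is used here for brevity)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.mrf = MRFBlock(in_ch, 16)
        self.head = nn.Conv2d(16, out_ch, 3, padding=1)

    def forward(self, x):
        return self.head(self.mrf(x))

class DMRFUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.mrf_unet1 = StageNet(1, 1)  # stage 1: B-scan denoising
        self.mrf_unet2 = StageNet(2, 1)  # stage 2: permittivity inversion

    def forward(self, noisy_bscan):
        denoised = self.mrf_unet1(noisy_bscan)
        # Concatenate the denoised and noisy B-scans along the channel axis,
        # as described in the abstract.
        fused = torch.cat([denoised, noisy_bscan], dim=1)
        permittivity = self.mrf_unet2(fused)
        return denoised, permittivity

def end_to_end_loss(denoised, clean_bscan, perm_pred, perm_true, w1=1.0, w2=1.0):
    # Joint loss over both stages; the L1 terms and weights w1, w2 are assumptions.
    return w1 * F.l1_loss(denoised, clean_bscan) + w2 * F.l1_loss(perm_pred, perm_true)

# Toy usage with random tensors standing in for B-scans and permittivity maps.
noisy = torch.randn(4, 1, 64, 64)
clean = torch.randn(4, 1, 64, 64)
perm_true = torch.randn(4, 1, 64, 64)
model = DMRFUNet()
denoised, perm_pred = model(noisy)
loss = end_to_end_loss(denoised, clean, perm_pred, perm_true)
loss.backward()  # gradients flow through both stages (end-to-end training)
```

Training the two stages jointly through a single combined loss, rather than sequentially, is what the abstract credits with avoiding information loss between the denoising and inversion steps.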