Abstract

Ground penetrating radar (GPR) imaging [1] is a well-established non-destructive technology exploited in many application contexts, such as structural assessment [2], cultural heritage [3], and others. However, GPR raw data are difficult to interpret, since targets do not appear with their geometrical shape but as diffraction hyperbolas, owing to the probe-target relative motion during the measurement. Linearized Microwave Tomography (MWT) approaches allow retrieving qualitative maps of the probed scene in terms of the position and approximate geometry of the targets, thus providing a more easily interpretable image of the investigated scenario. Unfortunately, they do not provide quantitative information about the targets in terms of permittivity/conductivity profiles. Recently, deep learning (DL) techniques have been proposed to face this problem. DL approaches are data-driven methods that use proper training data to learn to map the input data into the desired output. As regards quantitative GPR imaging, different approaches have been proposed in the literature, e.g. see [4], [5]. In this contribution, we adopt the well-known Convolutional Neural Network (CNN) U-NET to tackle the quantitative GPR imaging problem. As a novel point compared to previous works on DL-based quantitative GPR imaging, the network takes as input the linear MWT images instead of the GPR raw data. Such an approach is expected to simplify the learning process, as pointed out in [6]. Full-wave simulated data are used to train the network, and numerical experiments are reported as a preliminary assessment of the effectiveness of the proposed strategy.
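To illustrate the kind of image-to-image mapping described above, the following is a minimal U-Net-style sketch in PyTorch. The paper does not specify its framework, layer counts, channel widths, or output parametrization, so all of those choices here (single-channel MWT image in, single-channel permittivity map out, two encoder levels) are illustrative assumptions, not the authors' actual network.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU; padding=1 preserves spatial size
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class MiniUNet(nn.Module):
    # Toy encoder-decoder with one skip connection, in the spirit of U-NET.
    # in_ch=1: a single-channel linearized MWT image of the probed scene.
    # out_ch=1: e.g. a relative-permittivity map of the same spatial size.
    # (All sizes are placeholder assumptions, not the paper's configuration.)
    def __init__(self, in_ch=1, out_ch=1, base=16):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.pool = nn.MaxPool2d(2)
        self.enc2 = conv_block(base, base * 2)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)  # skip connection doubles channels
        self.head = nn.Conv2d(base, out_ch, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d1)

if __name__ == "__main__":
    net = MiniUNet()
    mwt_image = torch.randn(1, 1, 64, 64)  # placeholder tomographic image
    eps_map = net(mwt_image)               # predicted quantitative map
    print(tuple(eps_map.shape))            # (1, 1, 64, 64)
```

In a setup like the one described, such a network would be trained on pairs of linearized MWT reconstructions and the corresponding ground-truth permittivity maps from full-wave simulations, e.g. with a pixel-wise regression loss.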
