Abstract

Acquiring continuous spatial data, e.g., the spatial distribution of ground motion, is essential for assessing damaged areas and appropriately assigning rescue and medical teams. To this end, spatial interpolation methods such as inverse distance weighting and Kriging have been developed, which estimate the value at an unobserved point as a linear combination of neighboring observed values. Meanwhile, realistic continuous spatial environmental data for various scenarios can be generated by 3-D finite-difference methods using high-resolution structure models, which makes it possible to collect supervised data even at unobserved points. This paper therefore proposes a framework of supervised spatial interpolation and applies advanced deep inpainting methods, in which spatially distributed observation points are treated as masked images and interpolated non-linearly through convolutional encoder–decoder networks. However, the translation invariance of convolutions hinders locally fine-grained interpolation, because the relation between a target point and the surrounding observation points varies from region to region owing to differences in topography and subsurface structure. To overcome this issue, this paper introduces position-dependent partial convolution, in which the kernel weights are adjusted according to their position on the image via a trainable position-feature map. Experimental results on toy and ground-motion data demonstrate the effectiveness of the proposed method, called the Position-dependent Deep Inpainting Method.
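
To make the core mechanism concrete, the sketch below illustrates one plausible form of a position-dependent partial convolution layer in PyTorch. It is a hypothetical reconstruction, not the paper's implementation: the class name `PositionDependentPartialConv2d`, the trainable map `pos_feat`, and the 1×1 modulation `pos_to_scale` are all assumptions made for illustration. The layer combines standard partial convolution (renormalizing the response by the count of observed pixels under the kernel, per Liu et al.'s partial-convolution inpainting) with a trainable per-location feature map that modulates the response, so that identical observation patterns can be interpolated differently in different regions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PositionDependentPartialConv2d(nn.Module):
    """Sketch of a partial convolution modulated by a trainable
    position-feature map (hypothetical reconstruction; the paper's
    exact weight-adjustment scheme may differ)."""

    def __init__(self, in_ch, out_ch, kernel_size, height, width, pos_ch=8):
        super().__init__()
        pad = kernel_size // 2
        # Standard (translation-invariant) convolution applied to masked input.
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, padding=pad, bias=False)
        self.bias = nn.Parameter(torch.zeros(out_ch))
        # Fixed all-ones kernel used to count observed pixels under each window.
        self.register_buffer(
            "mask_kernel", torch.ones(1, 1, kernel_size, kernel_size))
        # Trainable position-feature map: one feature vector per pixel location,
        # letting the layer behave differently in different regions of the map.
        self.pos_feat = nn.Parameter(torch.zeros(1, pos_ch, height, width))
        # 1x1 conv turning position features into a per-location, per-channel
        # multiplicative adjustment of the convolution response.
        self.pos_to_scale = nn.Conv2d(pos_ch, out_ch, kernel_size=1)

    def forward(self, x, mask):
        # x: (B, C, H, W) sparse observations; mask: (B, 1, H, W), 1 = observed.
        out = self.conv(x * mask)
        with torch.no_grad():
            valid = F.conv2d(mask, self.mask_kernel,
                             padding=self.mask_kernel.shape[-1] // 2)
        # Partial-conv renormalization: rescale by the fraction of the window
        # that was actually observed (clamp avoids division by zero at holes).
        out = out * (self.mask_kernel.numel() / valid.clamp(min=1.0))
        # Position-dependent adjustment: the same local observation pattern
        # yields different outputs at different map locations.
        out = out * (1.0 + self.pos_to_scale(self.pos_feat))
        out = out + self.bias.view(1, -1, 1, 1)
        # Mask update: a pixel becomes valid once any observed pixel falls
        # inside its receptive window.
        new_mask = (valid > 0).float()
        return out, new_mask
```

Modulating the output through a 1×1 convolution over a trainable position map is one inexpensive way to break translation invariance; storing a fully separate kernel per pixel would be far more costly, and the adjustment scheme actually used in the paper may differ from this sketch.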
