Abstract

Distributed acoustic sensing (DAS) has gained significant prominence with the rapid development of exploration techniques, drawing increasing attention to seismic data acquisition under complex geological conditions. However, background noise in DAS degrades data quality and significantly affects the accuracy of subsequent migration and imaging. Moreover, these interferences have a different generation mechanism from the random noise in geophone-acquired seismic data, which degrades the performance of traditional denoising techniques. Recent work has introduced convolutional neural networks (CNNs) for DAS background-noise attenuation, demonstrating superiority over conventional techniques in denoising capability and processing accuracy. Unfortunately, most CNN-based methods rely on a single-scale architecture and cannot accurately denoise DAS records containing weak reflection signals. In this research, we introduce a residual dense-connection attention network, RDA-Net, to further improve denoising ability. Specifically, the multi-scale structure in the backbone of RDA-Net captures informative features from the DAS data. Simultaneously, a self-enhanced spatial attention module refines and strengthens the primary features, improving feature representation and denoising performance. Furthermore, dense connections reinforce feature interaction and preserve effective features. Compared with typical conventional methods and CNN-based frameworks, RDA-Net improves the signal-to-noise ratio by over 24 dB and shows superior weak-signal recovery and intense DAS noise attenuation on both synthetic and field records.
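The abstract does not specify the internals of the self-enhanced spatial attention module, but spatial attention in denoising CNNs commonly pools feature maps across channels, maps the result to [0, 1] weights, and rescales each spatial location. The following is a minimal, hypothetical NumPy sketch of that generic pattern; the function name, pooling choices, and shapes are illustrative assumptions, not RDA-Net's actual design.

```python
import numpy as np

def spatial_attention(features):
    """Hypothetical spatial-attention gate (illustrative, not RDA-Net's
    exact module): pool feature maps across the channel axis, squash the
    pooled map to [0, 1] weights with a sigmoid, and rescale every spatial
    location. `features` has shape (channels, height, width)."""
    avg_pool = features.mean(axis=0, keepdims=True)      # (1, H, W)
    max_pool = features.max(axis=0, keepdims=True)       # (1, H, W)
    attn = 1.0 / (1.0 + np.exp(-(avg_pool + max_pool)))  # sigmoid weights in (0, 1)
    return features * attn                               # broadcast over channels

# Usage: gate a random feature tensor; output keeps the input shape,
# and each location is scaled down by its attention weight.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16, 16))
y = spatial_attention(x)
```

Because the attention weights lie strictly in (0, 1), the gate attenuates every spatial position rather than amplifying it; a learned module would additionally include trainable parameters before the sigmoid.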
