Abstract

Seismic data denoising (SDD) plays an important role in obtaining high-quality data for subsequent seismic imaging and inversion. However, traditional SDD methods still suffer from several drawbacks, such as difficult parameter selection, high computational costs, and a strong dependence on the experience of processing personnel. Deep learning (DL)-based SDD methods can overcome these issues to a certain extent, offering high computational efficiency and negligible dependence on operator experience. However, existing DL-based SDD methods typically adopt a single, fixed network architecture, which leaves room for improvement in training efficiency and denoising quality. In this paper, we propose a two-step prediction denoising method that combines two identical or different network architectures to improve the accuracy and efficiency of DL-based SDD. Compared with existing SDD methods, the two-step prediction denoising method is better able to extract feature information at multiple scales because multiple network architectures are combined in a single denoising task, which improves both training efficiency and denoising accuracy. In this study, we implemented the two-step prediction denoising method using an improved denoising convolutional neural network (DnCNN) and a nested U-shaped fully convolutional network (U-Net++). Several synthetic numerical examples based on three different types of noise were denoised, with the results indicating that the proposed method effectively improves the accuracy and efficiency of DL-based SDD. Finally, a test using field data was performed to demonstrate the real-world applicability of the novel two-step prediction denoising method.
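The two-step cascade described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper uses a trained improved DnCNN (residual, noise-predicting) followed by a trained U-Net++, whereas here each stage is replaced by a simple hypothetical smoothing filter so the pipeline structure runs without a deep-learning framework.

```python
import numpy as np

# Hypothetical stand-ins for the two trained networks. In the paper these are
# an improved DnCNN and a U-Net++; here each stage is a simple moving-average
# filter so the two-step structure can be demonstrated without a DL framework.
def stage1_predict_noise(noisy):
    """DnCNN-style stage: predict the noise component (residual learning)."""
    smoothed = np.convolve(noisy, np.ones(5) / 5, mode="same")
    return noisy - smoothed  # estimated noise

def stage2_refine(intermediate):
    """U-Net++-style stage: refine the intermediate denoised estimate."""
    return np.convolve(intermediate, np.ones(3) / 3, mode="same")

def two_step_denoise(noisy):
    # Step 1: subtract the predicted noise to get an intermediate estimate.
    intermediate = noisy - stage1_predict_noise(noisy)
    # Step 2: pass the intermediate result through the second network
    # for further refinement at a different effective scale.
    return stage2_refine(intermediate)

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4 * np.pi, 200))       # toy 1-D seismic trace
noisy = clean + 0.3 * rng.standard_normal(200)        # additive random noise
denoised = two_step_denoise(noisy)

# The cascade should reduce the error relative to the raw noisy trace.
print(np.mean((noisy - clean) ** 2) > np.mean((denoised - clean) ** 2))
```

In a real implementation each stage would be a trained network, and the second stage could take both the noisy input and the first-stage estimate, but the cascaded predict-then-refine control flow is the same.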

