Abstract
The physical and cost limitations of seismic data acquisition often result in spatially undersampled data, which degrades subsequent processing. Seismic data reconstruction plays a crucial role in recovering the traces missing from the acquisition process. In recent years, deep learning (DL) has emerged as an intelligent solution for this purpose. We introduce a weighted-attentive DL framework for unsupervised 3D seismic data reconstruction, which combines an attentive transformer network (ATNet) with a conventional projection onto convex sets (POCS) algorithm. Our framework follows the plug-and-play concept and requires only the original subsampled data to achieve robust performance. It does so by iteratively updating the ATNet parameters, with ATNet serving as the reconstruction operator during the DL process. This allows us to simultaneously recover missing traces and suppress random noise; a key feature is that the network attends to the observed signal more strongly than convolutional neural networks do. In addition, we incorporate the POCS algorithm into our framework to introduce a linearly decaying weight for the aliased space containing both noise and signal during reconstruction. We evaluate our method against four existing approaches: adaptive POCS, fast dictionary learning sequential generalized K-means, optimally damped rank-reduction, and DenseNet. Comparative experiments on synthetic and field data demonstrate that our method achieves substantial improvements in reconstruction and denoising over these four benchmarks.
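To make the weighted POCS update concrete, below is a minimal sketch of one plausible reading of the iteration the abstract describes: the current estimate is passed through a reconstruction operator, and the result is blended with the observed traces using a weight that decays linearly over iterations. The function name `weighted_pocs`, the parameter choices, and the FFT-domain thresholding stand-in for the reconstruction operator are all illustrative assumptions; in the paper's framework that operator is the learned ATNet, not a fixed transform.

```python
import numpy as np

def weighted_pocs(d_obs, mask, n_iter=50, w_start=1.0, w_end=0.5):
    """Illustrative weighted-POCS iteration for trace reconstruction.

    d_obs : observed data with missing traces zeroed, shape (nt, nx)
    mask  : 1.0 where a trace was recorded, 0.0 where it is missing
    NOTE: the FFT hard-thresholding below is only a stand-in for the
    reconstruction operator; the paper uses ATNet in this role.
    """
    x = d_obs.copy()
    for k in range(n_iter):
        # Reconstruction operator A(x): keep the strongest Fourier
        # coefficients, relaxing the threshold as iterations proceed.
        spec = np.fft.fft2(x)
        p = 99.0 - (99.0 - 90.0) * k / max(n_iter - 1, 1)
        thresh = np.percentile(np.abs(spec), p)
        spec[np.abs(spec) < thresh] = 0.0
        x_rec = np.real(np.fft.ifft2(spec))

        # Linearly decaying weight on the observed traces, so noisy
        # observations are trusted less in later iterations.
        w = w_start + (w_end - w_start) * k / max(n_iter - 1, 1)
        x = w * mask * d_obs + (1.0 - w * mask) * x_rec
    return x
```

Where the recorded traces are noise free, `w_end` can be kept at 1.0, recovering the standard POCS insertion of observed data; letting the weight decay below 1.0 allows the reconstruction operator to also denoise the observed traces, which mirrors the simultaneous reconstruction-and-denoising behavior described above.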