Abstract
Accurately estimating seismic volumetric dip is crucial for subsequent seismic processing and interpretation. Traditional window-based dip estimation methods, such as the gradient structure tensor (GST) and semblance-based multiple-window scanning strategies, often struggle with complicated geological structures that contain multiple reflections with different dips within the analysis window. Recently, deep learning has been applied to seismic volumetric dip estimation. However, the dip labels used by current convolutional neural network (CNN) based methods are typically produced by traditional dip estimation methods, and are therefore not ground truth, or are derived from specific geological structure models that cover only a subset of typical geological structures. We propose a supervised deep learning model that improves the accuracy of seismic dip estimation by integrating realistic synthetic data sets. First, we design a synthetic seismic data generation workflow for volumetric dip estimation that simulates geological features observed in real seismic data. Using this workflow, we create numerous unique synthetic seismic images with realistic and diverse structural features. To guarantee that the generated dip labels are exact and can be regarded as ground truth, we create each synthetic seismic image by deforming a horizontal-reflection image according to its corresponding dip label. Finally, we train an end-to-end supervised deep learning model, designed for parallel processing and the simultaneous extraction of diverse feature maps, on the generated realistic synthetic data. Applications to synthetic and 3D real seismic data (the Netherlands F3 block) demonstrate the validity and effectiveness of the proposed model.
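The core idea of the labeling strategy, building a flat-layered image first and then shearing it according to a known displacement field so the dip label is exact by construction, can be sketched as follows. This is a minimal 2D illustration under assumed conventions (vertical shear only, dip expressed in samples per trace, sign fixed by the interpolation direction), not the paper's actual generation workflow:

```python
# Minimal sketch: synthesize a 2-D seismic image with an exactly known
# dip label by vertically shearing a flat-layered image. The shift
# field, wavelet, and sign convention here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
nz, nx = 128, 128  # depth samples, traces

# 1. Flat-layered "reflectivity": one random value per depth sample.
reflectivity = rng.standard_normal(nz)

# 2. Convolve with a Ricker wavelet along depth to mimic a seismic trace.
def ricker(f=0.1, n=41):
    t = np.arange(n) - n // 2
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

trace = np.convolve(reflectivity, ricker(), mode="same")
# Tiling this trace across all x would give a zero-dip image.

# 3. Smooth vertical shift field s(x), e.g. a gentle fold (in samples).
x = np.arange(nx)
shift = 8.0 * np.sin(2.0 * np.pi * x / nx)

# 4. Shear each trace: the sample at depth z is read from depth z + s(x)
#    of the flat trace (linear interpolation, edge values clamped).
z = np.arange(nz)
sheared = np.empty((nz, nx))
for ix in range(nx):
    sheared[:, ix] = np.interp(z + shift[ix], z, trace)

# 5. Under this convention a reflector appears at depth z0 - s(x), so the
#    exact dip label is -ds/dx; no dip estimator is ever involved.
dip_label = -np.gradient(shift)  # one dip value per trace in this 2-D toy
```

In the paper's 3D setting the deformation and dip label would be full volumes rather than a single curve per trace, but the principle is the same: because the image is produced *from* the dip field, the label is ground truth rather than the output of another estimator.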