Abstract
Seismic acoustic impedance bridges the gap between post-stack seismic reflection data and reservoir parameters such as lithology and porosity, and hence plays an important role in stratigraphic interpretation. Owing to sedimentation and propagation effects, both seismic records and impedance are spatio-temporal data; that is, adjacent data points are coupled. However, most deep learning inversion methods consider only local shape information, ignore the time-series characteristics of the data, and place heavy demands on the training data, which makes inversion more difficult and lowers inversion accuracy. For this reason, we develop a spatio-temporal neural network (STNN) to perform post-stack impedance inversion. The network consists mainly of a convolutional neural network (CNN) block and a recurrent neural network (RNN) block connected in series. The proposed method can thus take full advantage of both the CNN and the RNN to capture the dynamics and correlations of seismic series at the spatial and temporal levels, yielding more continuous and stable results. We use an overthrust model example and a field data case to test the performance of the STNN and demonstrate its advantages over traditional deep learning (i.e., CNN-based) impedance inversion. Through a series of numerical experiments, we find that the STNN produces more geologically reliable results, which not only preserve the coupling relationships between adjacent points in the vertical direction but also clearly reveal stratigraphic variations in the lateral direction.
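The series connection of a convolutional block (capturing local waveform shape) and a recurrent block (coupling adjacent time samples) can be illustrated with a minimal numerical sketch. All layer sizes, weight initializations, and function names below are illustrative assumptions for exposition, not the authors' actual STNN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels, bias):
    """Valid-mode 1D convolution: trace x of length T with C kernels of
    width K gives a (T-K+1, C) feature array capturing local shape."""
    C, K = kernels.shape
    T = len(x) - K + 1
    out = np.empty((T, C))
    for t in range(T):
        out[t] = kernels @ x[t:t + K] + bias
    return np.tanh(out)  # nonlinearity closing the conv block

def rnn(feats, Wh, Wx, b):
    """Simple tanh recurrence over conv features: the sequential hidden
    state is what couples adjacent time samples in the output."""
    H = Wh.shape[0]
    h = np.zeros(H)
    states = []
    for f in feats:
        h = np.tanh(Wh @ h + Wx @ f + b)
        states.append(h)
    return np.stack(states)

# Toy forward pass: one seismic trace -> impedance-like series.
T, K, C, H = 64, 5, 8, 16  # trace length, kernel width, channels, hidden size
trace = rng.standard_normal(T)
feats = conv1d(trace, rng.standard_normal((C, K)) * 0.1, np.zeros(C))
hidden = rnn(feats,
             rng.standard_normal((H, H)) * 0.1,
             rng.standard_normal((H, C)) * 0.1,
             np.zeros(H))
impedance = hidden @ (rng.standard_normal(H) * 0.1)  # linear output head

print(impedance.shape)  # one predicted value per retained time sample
```

In an actual implementation the weights would of course be learned from well-log training pairs rather than drawn at random; the sketch only shows how the two blocks compose, with the recurrence supplying the vertical coupling that a purely convolutional network lacks.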