Abstract
The elastic properties of the subsurface, such as density, P-wave velocity and S-wave velocity, can be estimated by pre-stack seismic inversion. In recent research, deep neural networks have been widely used in pre-stack seismic inversion because of their strong non-linear fitting and feature-extraction abilities. However, labelled data are generally scarce owing to high drilling costs and strong data-sharing barriers in exploration geophysics. To reduce the dependence of network performance on labelled data while preserving inversion accuracy as far as possible, semi-supervised learning is adopted. Here, we develop a cooperative multi-network semi-supervised pre-stack seismic inversion method. In the cooperative multi-network inversion framework, an inversion network, a mapping network and a modification network complete the inversion task cooperatively. A forward network is constructed to automatically generate seismic data from density, P-wave velocity and S-wave velocity, which enables the above networks to be trained in a semi-supervised manner. Compared with some published deep-learning pre-stack inversion methods, the spatio-temporal correlation of the data is fully exploited, prior geological structure and low-frequency information are used effectively, and reflectivity is adopted as an intermediate output parameter to improve the robustness of the method. Experiments on the Marmousi2 model demonstrate that the cooperative multi-network semi-supervised inversion strategy is superior to conventional semi-supervised inversion methods in both inversion accuracy and noise robustness. In addition, sensitivity experiments on the initial model indicate that the proposed method maintains high inversion accuracy even when the initial model contains little effective information. Finally, the proposed method is successfully applied to field data and yields a high-resolution inversion result.
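The abstract describes a closed loop: an inversion network maps seismic data to elastic parameters, and a forward network maps elastic parameters back to seismic data, so unlabelled seismic traces can constrain training through a reconstruction loss. The sketch below illustrates this semi-supervised objective. It is a minimal illustration assuming PyTorch; the network architectures, the names InversionNet and ForwardNet, the data shapes and the weight lam are all hypothetical, not the authors' implementation, and the mapping network, modification network and reflectivity intermediate are omitted for brevity.

```python
# Minimal sketch of a semi-supervised inversion/forward loop (assumed PyTorch).
# All architectures, names and hyperparameters here are hypothetical.
import torch
import torch.nn as nn

class InversionNet(nn.Module):
    """Maps pre-stack seismic traces to elastic parameters (density, Vp, Vs)."""
    def __init__(self, n_in, n_out):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, n_out),
        )
    def forward(self, x):
        return self.net(x)

class ForwardNet(nn.Module):
    """Maps elastic parameters back to synthetic seismic data."""
    def __init__(self, n_in, n_out):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, 128), nn.ReLU(),
            nn.Linear(128, n_out),
        )
    def forward(self, m):
        return self.net(m)

n_seis, n_elastic = 64, 3   # trace length and (density, Vp, Vs)
inv = InversionNet(n_seis, n_elastic)
fwd = ForwardNet(n_elastic, n_seis)
opt = torch.optim.Adam(list(inv.parameters()) + list(fwd.parameters()), lr=1e-3)
mse = nn.MSELoss()
lam = 1.0                   # weight of the unsupervised term (hypothetical)

# Toy tensors standing in for a few labelled wells and many unlabelled traces.
d_lab, m_lab = torch.randn(8, n_seis), torch.randn(8, n_elastic)
d_unlab = torch.randn(32, n_seis)

for step in range(100):
    opt.zero_grad()
    # Supervised term: match elastic logs at the few labelled locations.
    loss_sup = mse(inv(d_lab), m_lab)
    # Unsupervised term: seismic reconstructed through the forward network
    # must match the observed traces, so unlabelled data also constrain inv.
    loss_unsup = mse(fwd(inv(d_unlab)), d_unlab)
    loss = loss_sup + lam * loss_unsup
    loss.backward()
    opt.step()
```

The design point this sketch captures is that the forward network turns every unlabelled trace into a training signal, which is why the method's accuracy depends less on the amount of labelled well data.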