Abstract
Three-dimensional seismic interpretation has been significantly accelerated by the recent implementation of various machine learning algorithms, particularly supervised convolutional neural networks (CNNs). CNNs are able to parse seismic data from the perspective of pattern recognition, extract seismic features at multiple scales, and provide acceptable predictions. The performance of a supervised CNN in seismic image interpretation depends greatly on its training labels, which are usually a set of seismic sections with expert annotations. Among the thousands of sections in a typical 3D seismic cube, effectively selecting the most representative ones is a challenging task. A common approach is to have an experienced interpreter visually screen all of the sections and make a selection. To improve the efficiency of training-section selection and to avoid introducing bias from manual screening, this work proposes an automated-active-learning (AutoAL) workflow for interactive seismic image interpretation. The workflow quantitatively evaluates the machine's performance after each iteration and efficiently recommends the sections to be labeled for learning in the next iteration. The added value of the proposed approach is validated through an application of seismic facies classification to the Parihaka data set in the northwest part of the offshore Taranaki Basin in New Zealand. Starting from four initial sections, in three iterations the proposed AutoAL automatically recommends 14 of the more than 1300 sections as training data, raising both the accuracy and the average F1 score of the machine prediction above 0.9. Comparisons demonstrate better prediction by the proposed scheme than by traditional training-section selection schemes, such as manual screening and clustering-based recommendation.
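The iterative recommend-label-retrain loop described above can be illustrated with a minimal pool-based active-learning sketch. The abstract does not specify AutoAL's exact acquisition criterion, so this example assumes a common one, prediction entropy; the function name and shapes are illustrative, not from the paper.

```python
import numpy as np

def recommend_sections(probs, labeled_idx, n_recommend=4):
    """Illustrative active-learning step (assumed entropy criterion,
    not necessarily the paper's AutoAL scoring rule).

    probs: (n_sections, n_classes) array of per-section mean class
           probabilities from the current CNN.
    labeled_idx: set of section indices already annotated by the expert.
    Returns the indices of the most uncertain unlabeled sections,
    i.e., those recommended for labeling in the next iteration.
    """
    eps = 1e-12  # guard against log(0)
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    entropy[list(labeled_idx)] = -np.inf  # never re-recommend labeled sections
    order = np.argsort(entropy)[::-1]     # most uncertain first
    return order[:n_recommend].tolist()

# Hypothetical pool of four sections, three facies classes:
probs = np.array([
    [0.90, 0.05, 0.05],   # confident
    [0.34, 0.33, 0.33],   # very uncertain -> should be recommended
    [0.05, 0.90, 0.05],   # confident
    [0.98, 0.01, 0.01],   # already labeled
])
print(recommend_sections(probs, labeled_idx={3}, n_recommend=1))  # [1]
```

In a full workflow this step would alternate with retraining the CNN on the enlarged label set and re-evaluating accuracy and F1, stopping once the metrics plateau.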