Abstract

Gravels are widely distributed in the Baikouquan formation in the Mabei area of the Junggar Basin. However, conventional logging methods cannot quantitatively characterize gravel development, which limits the identification of lithology, structure, and sedimentary facies in this region. This study proposes a new method for automatically identifying gravels in electric imaging images and calculating gravel parameters using a salient object detection (SOD) network. First, a SOD network model (U2-Net) was constructed for electric imaging data from the Baikouquan formation in the Mahu Sag. The blank strips in the images were filled using a U-Net convolutional neural network model. Sample sets were then prepared, and the gravel areas in the electric imaging images were labeled with the Labelme software in combination with image segmentation and human–machine interaction. These sample sets were used to train the network model, enabling the automatic recognition of gravel areas and the segmentation of adhesive gravel regions in the electric imaging images. Based on the segmented gravel results, quantitative evaluation parameters such as particle size and gravel quantity were accurately calculated. The method's validity was confirmed on validation sets and actual field data. This approach improves the accuracy and processing speed of adhesive-region segmentation while effectively reducing human error. The trained network model achieved an average absolute error of 0.0048 and a recognition accuracy of 83.7% on the test sets. This method provides algorithmic support for the refined evaluation of glutenite reservoir logging.
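The abstract describes deriving quantitative parameters (gravel count, particle size) from the segmented gravel regions. The sketch below is an illustrative assumption of how such parameters could be computed from a binary segmentation mask using standard connected-component analysis; it is not the authors' implementation, and the `mm_per_pixel` scale is a hypothetical placeholder for the resolution of the electric imaging log.

```python
# Illustrative sketch (not the authors' code): given a binary gravel mask
# produced by a segmentation network such as U2-Net, derive simple
# quantitative parameters (gravel count, equivalent particle size).
import numpy as np
from skimage import measure


def gravel_parameters(mask: np.ndarray, mm_per_pixel: float = 0.1):
    """Count gravel regions and estimate equivalent circular diameters.

    mask          -- 2-D binary array, 1 where gravel was segmented
    mm_per_pixel  -- assumed physical scale of the electric imaging image
    """
    labels = measure.label(mask, connectivity=2)   # label connected gravel regions
    regions = measure.regionprops(labels)

    diameters_mm = []
    for region in regions:
        # Equivalent diameter of a circle with the same area as the region
        eq_diam_px = 2.0 * np.sqrt(region.area / np.pi)
        diameters_mm.append(eq_diam_px * mm_per_pixel)

    return {
        "gravel_count": len(regions),
        "mean_diameter_mm": float(np.mean(diameters_mm)) if diameters_mm else 0.0,
        "max_diameter_mm": float(np.max(diameters_mm)) if diameters_mm else 0.0,
    }


if __name__ == "__main__":
    # Toy example: two square "gravels" in a 100x100 mask
    demo = np.zeros((100, 100), dtype=np.uint8)
    demo[10:20, 10:20] = 1
    demo[50:70, 50:70] = 1
    print(gravel_parameters(demo))
```

In practice the mask would come from the trained network's output after thresholding and splitting of adhesive gravel regions, as described in the abstract.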
