Abstract

This article presents a novel Artificial Intelligence (AI) workflow to enhance drilling performance by mitigating the adverse impact of drill-string vibrations on drilling efficiency. The study employs three supervised machine learning (ML) algorithms, namely the Multi-Layer Perceptron (MLP), Support Vector Regression (SVR), and Decision Tree Regression (DTR), to train models for bit rotational speed (Bit RPM), rate of penetration (ROP), and torque. These models are combined to form a digital twin of the drilling system and are validated against actual drilling parameters from field data through extensive cross-validation. The combined SVR-Bit RPM model is then used to categorize torsional vibrations and constrain the selection of optimized parameters by the Particle Swarm Optimization (PSO) block. The SVR-ROP model is integrated with PSO under two constraints, stick-slip index (SSI < 0.05) and depth of cut (DOC < 5 mm), to further improve torsional stability. Simulations predict a 43% average increase in ROP, together with improved torsional stability, when the optimized weight on bit (WOB) and RPM are applied. This would avoid the need to trip in/out to change the bit and would reduce drilling time from 66 to 31 h. The findings of this study demonstrate the system's ability to determine optimal drilling parameters and boost drilling efficiency. Integrating AI techniques offers valuable insights and practical solutions for drilling optimization, particularly in reducing drilling time and improving ROP, which increases potential savings.
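To illustrate the kind of coupling the abstract describes, the sketch below trains SVR surrogates for ROP and Bit RPM and then runs a minimal particle swarm search over WOB and surface RPM, penalizing candidates that violate the stated constraints (SSI < 0.05, DOC < 5 mm). The training data, feature set, and the SSI/DOC formulas here are illustrative assumptions, not the paper's actual models or field data.

```python
# Hypothetical sketch: SVR surrogates + particle swarm search over (WOB, RPM).
# Data, units, and the SSI/DOC proxy formulas are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder training data (stand-in for historical drilling logs)
X = rng.uniform([5.0, 60.0], [35.0, 220.0], size=(500, 2))          # [WOB (klbf), surface RPM]
rop = 20 + 1.5 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 2, 500)     # synthetic ROP (ft/h)
bit_rpm = 0.9 * X[:, 1] - 0.05 * X[:, 0] + rng.normal(0, 3, 500)     # synthetic Bit RPM

# One SVR surrogate per response, echoing the combined digital-twin idea
rop_model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5)).fit(X, rop)
bit_model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5)).fit(X, bit_rpm)

def fitness(wob, rpm):
    """Negative predicted ROP plus penalties for the torsional-stability constraints."""
    x = np.array([[wob, rpm]])
    pred_rop = rop_model.predict(x)[0]
    pred_bit = bit_model.predict(x)[0]
    ssi = abs(rpm - pred_bit) / max(rpm, 1e-6)             # assumed stick-slip index proxy
    doc = 25.4 * pred_rop / (60.0 * max(pred_bit, 1e-6))   # assumed depth of cut per rev (mm)
    penalty = 1e3 * max(0.0, ssi - 0.05) + 1e3 * max(0.0, doc - 5.0)
    return -pred_rop + penalty

# Minimal particle swarm over the (WOB, RPM) box
lo, hi = np.array([5.0, 60.0]), np.array([35.0, 220.0])
pos = rng.uniform(lo, hi, size=(30, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(*p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(100):
    r1, r2 = rng.random((30, 2)), rng.random((30, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([fitness(*p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("Suggested (WOB, RPM):", gbest, " predicted ROP:", rop_model.predict([gbest])[0])
```

In this sketch the constraints are handled as soft penalties inside the swarm's objective; the paper's workflow may enforce them differently, but the overall pattern (ML surrogates supplying predictions that PSO queries when proposing WOB/RPM setpoints) is the same.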
