This article, written by JPT Technology Editor Judy Feder, contains highlights of paper SPE 193680, "Implementation and Assessment of Production Optimization in a Steamflood Using Machine-Learning-Assisted Modeling," by Pallav Sarma, SPE, Ken Lawrence, Yong Zhao, Stylianos Kyriacou, and Delon Saks, Tachyus, prepared for the 2018 SPE International Heavy Oil Conference and Exhibition, Kuwait City, 10–12 December.

A physics-based model augmented by machine learning proved its ability to optimize a steam-injection plan in a shallow, heavy-oil field in the San Joaquin Basin of California. The model, and a case study validating its predictive capabilities, were described in the previously published paper SPE 185507. Paper SPE 193680 updates that case study and presents the results of actual field implementation of an optimized steam-injection plan based on the model framework.

Introduction

The goal of steamflood modeling and optimization is to determine the optimal spatial and temporal distribution of steam injection to maximize future recovery or field economics. Accurate modeling of the thermodynamic and fluid-flow mechanisms in the wellbore, reservoir layers, and overburden can be prohibitively resource-intensive for operators, who instead often default to simple decline-curve analysis and operational rules of thumb.

The physics-based model described in the paper allows operators to leverage readily available field data to infer reservoir dynamics from first principles. Production, injection, temperature, steam-quality, completion, and other engineering data from an active steamflood are continuously assimilated into the model using an ensemble Kalman filter (EnKF). The model is then used to optimize steam-injection rates against multiple objectives, such as maximizing net present value (NPV) or minimizing injection cost, using large-scale evolutionary optimization algorithms (both steps are sketched at the end of this article). The solutions are low order and continuous in scale, rather than discretized, so modeling, forecasting, and optimization are significantly faster than with traditional simulation.

Although steamfloods in the shallow, heavy-oil fields of the San Joaquin Valley have been very successful, the scale of many of these steamfloods also provides optimization opportunities. The basin contains hundreds or even thousands of wells with significant lateral and vertical reservoir heterogeneity, spatial variation of steam quality, varying historical completion quality and methods, and different levels of pattern maturity across the fields. As such, steamflood operations have multiple control variables that can be optimized.

Redistribution of steam among existing injectors is one of the most important control variables to optimize in a steamflood to account for the factors mentioned previously. A successful redistribution optimization produces a list of time-varying injection-rate changes for each injector that maximizes the overall effect of steam on incremental oil production. Additionally, optimizing steam cut across already-mature patterns can reduce operational costs.

For these crucial decisions, operators typically rely on semiquantitative approaches such as decline-curve analysis or simple analytical models. These methods can use only a small portion of the available data and function as rules of thumb, resulting, at best, in qualitatively optimized reservoir-management decisions.
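The decline-curve analysis mentioned above commonly means Arps-type rate/time models. As a point of reference only (the paper does not specify which formulation operators use), here is a minimal Python sketch of the Arps equation; the example parameter values are invented for illustration.

```python
import math

def arps_rate(t, qi, di, b):
    """Arps decline-curve rate at time t (illustrative sketch).

    qi : initial production rate (e.g., bbl/d)
    di : initial nominal decline rate (1/time, same time units as t)
    b  : hyperbolic exponent; b = 0 gives exponential decline,
         b = 1 gives harmonic decline.
    """
    if b == 0.0:
        return qi * math.exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Example (invented numbers): a well starting at 100 bbl/d with a 40%/yr
# initial decline and b = 0.5, forecast 5 years out.
print(arps_rate(5.0, qi=100.0, di=0.4, b=0.5))
```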
At the other extreme, some oil companies use sophisticated predictive modeling tools. Reservoir simulation is the most advanced technique available today, in that it can integrate disparate data sources and predict over long time horizons. While reservoir simulation is an excellent tool for field studies and long-term planning, certain limitations prevent operators from leveraging it for day-to-day decision-making.
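The paper describes the EnKF assimilation step conceptually but does not publish implementation details. For orientation only, the following NumPy sketch shows a standard stochastic (perturbed-observation) EnKF analysis step of the general kind such a workflow relies on; the array names, the linear observation operator, and the uncorrelated measurement noise are all simplifying assumptions, not the authors' code.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_std, rng):
    """One stochastic EnKF analysis step (illustrative sketch).

    ensemble    : (n_state, n_ens) array; each column is one realization of
                  the model state (a stand-in for the paper's parameters
                  and dynamic states).
    obs         : (n_obs,) vector of field measurements (e.g., rates,
                  temperatures).
    H           : (n_obs, n_state) linear observation operator (assumed
                  linear here; real forward models are nonlinear).
    obs_err_std : measurement-error standard deviation (scalar, assumed).
    """
    n_obs, n_ens = len(obs), ensemble.shape[1]

    # Ensemble anomalies about the ensemble mean.
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    HA = H @ A

    # Sample covariances in observation space, plus measurement-error cov.
    P_hh = HA @ HA.T / (n_ens - 1)
    P_xh = A @ HA.T / (n_ens - 1)
    R = obs_err_std**2 * np.eye(n_obs)

    # Kalman gain, then update each member against perturbed observations.
    K = P_xh @ np.linalg.inv(P_hh + R)
    D = obs[:, None] + obs_err_std * rng.standard_normal((n_obs, n_ens))
    return ensemble + K @ (D - H @ ensemble)

# Toy usage: 10 state variables, 100 members, observing the first 3 states.
rng = np.random.default_rng(1)
ens = rng.normal(size=(10, 100))
updated = enkf_update(ens, np.array([1.0, 0.5, -0.2]), np.eye(3, 10), 0.1, rng)
```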
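Likewise, the paper names large-scale evolutionary optimization without specifying an algorithm. One simple member of that family is a (mu + lambda) evolution strategy over per-injector steam rates, sketched below; the well count, rate bounds, and the toy NPV objective are illustrative assumptions standing in for the data-assimilated model's forecast.

```python
import numpy as np

rng = np.random.default_rng(0)

N_INJ = 50                        # number of steam injectors (illustrative)
RATE_MIN, RATE_MAX = 0.0, 500.0   # per-injector rate bounds (illustrative)

def npv(rates):
    """Toy stand-in for the model-predicted NPV of a candidate steam plan:
    a concave per-well response minus a total-steam cost term. The real
    objective would come from the physics-based model's forecast."""
    return np.sqrt(rates + 1.0).sum() - 0.01 * rates.sum()

def evolve(pop_size=40, n_gen=200, sigma=25.0):
    # Start from a random population of steam-allocation plans.
    pop = rng.uniform(RATE_MIN, RATE_MAX, size=(pop_size, N_INJ))
    for _ in range(n_gen):
        # Gaussian mutation of every plan, clipped to operational bounds.
        kids = np.clip(pop + sigma * rng.standard_normal(pop.shape),
                       RATE_MIN, RATE_MAX)
        # (mu + lambda) selection: keep the fittest plans overall.
        union = np.vstack([pop, kids])
        fitness = np.apply_along_axis(npv, 1, union)
        pop = union[np.argsort(fitness)[-pop_size:]]
    best = pop[np.argmax(np.apply_along_axis(npv, 1, pop))]
    return best, npv(best)

best_rates, best_value = evolve()
```

The time-varying redistribution described in the paper would extend the decision vector from one rate per injector to one rate per injector per control period; the selection loop itself is unchanged.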