This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 187462, “Bridging the Gap Between Subsurface and Surface Disciplines—A Tool for the Modern Facilities Engineer,” by Z. Cristea, SPE, Stochastic Asset Management, and T. Cristea, Independent, prepared for the 2017 SPE Annual Technical Conference and Exhibition, San Antonio, Texas, USA, 9–11 October. The paper has not been peer reviewed.

The paper presents an unbiased, stochastic, data-driven work flow in which surface and subsurface uncertainties are accounted for and their effects on facilities-design and operational decisions are quantified. Unlike the traditional approach to facilities design, in which the most-conservative values typically are used as design input variables, the proposed work flow accounts for life-cycle variability and for correlations among the relevant input data.

Introduction

Traditional facilities-design methodologies, based on deterministic (single-point) conservative conditions and design margins, might not capture the full spectrum of operational conditions throughout a field’s life cycle, resulting in significant residual risk and wasted resources during operations. Despite efforts to standardize project delivery as much as possible, industry projects remain heavily customized, intermittent (high-variety, low-volume) processes with a high degree of diversification and complexity. Facilities operations, on the other hand, are expected to be high-volume, low-variety, and ideally continuous processes.

In the proposed work flow, deterministic models are established to account for the dependencies between design input variables (static variables such as bottomhole pressure and temperature) and the desired objective (static results such as the chemical-injection rate). In the field, however, the analyzed variables change because of subsurface and surface events with different levels of uncertainty (e.g., condensate banking, lean-gas injection, water breakthrough). Stochastic algorithms are used to create probability distribution functions (PDFs) for all analyzed design input variables (stochastic variables). The stochastic algorithms then are applied to the deterministic model, sampling from the previously defined probability distributions. The stochastic results are assembled into charts and used to identify the most-relevant variables, and the correlations among them, that affect the model objectives.

Example Case

Equipment and Processes. A deterministic model is created to calculate the baseline methanol-injection rate (QMEOH) required to mitigate the hydrate-formation risk in a wet-gas field. Both the deterministic and the subsequent probabilistic modeling are performed in a system designed for statistical computation and graphics. Design-basis input data (static variables), considered to be conservative and to cover all operational scenarios, are used in the deterministic model. Furthermore, the deterministic model treats chemical injection as a continuous mass process.

In the field, parameters such as bottomhole pressure and the flow rates of the different phases change over time. These changes are caused by subsurface and surface events that can be predicted with different levels of uncertainty. Accordingly, PDFs are created for all relevant input parameters, replacing the single-point design values, and the stochastic variables are sampled jointly through the deterministic model.
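The synopsis does not reproduce the authors’ deterministic model or input data, and the paper performed its modeling in a dedicated statistical-computing system. The Python sketch below is therefore only a minimal illustration of the work flow’s structure: it assumes a simplified Hammerschmidt-type inhibitor relation as the deterministic model and uses placeholder PDFs for the stochastic variables; the function q_meoh, the distribution shapes, and all parameter values are hypothetical, not taken from the paper.

```python
# Minimal Monte Carlo sketch of the stochastic work flow described above.
# Assumptions (not from the paper): a simplified Hammerschmidt-type relation
# serves as the deterministic model, and the input PDFs are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of stochastic samples

# Deterministic model (static variables -> static result).
K_MEOH = 1297.0   # commonly quoted Hammerschmidt constant for delta-T in deg C
MW_MEOH = 32.04   # molecular weight of methanol, g/mol

def q_meoh(water_rate_kg_h, delta_t_c):
    """Methanol mass rate (kg/h) in the aqueous phase needed to depress the
    hydrate-formation temperature by delta_t_c (deg C); vapor and condensate
    losses are ignored in this sketch."""
    w = 100.0 * MW_MEOH * delta_t_c / (K_MEOH + MW_MEOH * delta_t_c)  # wt%
    return water_rate_kg_h * w / (100.0 - w)

# Stochastic variables: PDFs replacing the single-point design values.
water_rate = rng.triangular(left=50.0, mode=200.0, right=800.0, size=N)  # kg/h
delta_t = rng.normal(loc=12.0, scale=2.0, size=N).clip(min=0.1)          # deg C

# Apply the deterministic model to every sample (Monte Carlo propagation).
q = q_meoh(water_rate, delta_t)

# Stochastic results: percentiles of the required injection rate.
p10, p50, p90 = np.percentile(q, [10, 50, 90])
print(f"Q_MEOH  P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f} kg/h")

# Spearman rank correlation (no tie handling) shows which input drives
# the spread in the result.
for name, x in [("water_rate", water_rate), ("delta_t", delta_t)]:
    rank_x = np.argsort(np.argsort(x))
    rank_q = np.argsort(np.argsort(q))
    r = np.corrcoef(rank_x, rank_q)[0, 1]
    print(f"rank correlation of Q_MEOH with {name}: {r:.2f}")
```

Sampling every input jointly and pushing the samples through the same deterministic function is what turns the single-point design calculation into a distribution of required injection rates, from which percentile-based design points and rankings of the most-influential variables can be read, rather than a single most-conservative value.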