Abstract
This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 167836, "From Market-Basket Analysis to Wellhead Monitoring: Use of Events To Increase Oil Recovery," by R. Bailey, Z. Lu, S. Shirzadi, and E. Ziegel, BP, prepared for the 2014 SPE Intelligent Energy Conference and Exhibition, Utrecht, The Netherlands, 1-3 April. The paper has not been peer reviewed.

An operator has launched a successful, limited-scope deployment of a nonparametric capability known as event detection and association (EDA). Variations from the original analytical approach may be necessary before the EDA techniques can be used to solve problems in the field. EDA is part of a suite of tools collectively referred to as top-down waterflood (TDWF) diagnostics and optimization. By studying the time-based correlation of these input and output events, the basic TDWF capacitance/resistivity model is either confirmed or improved.

Introduction

Presented with time-series data such as controller strip charts, the human mind almost automatically seeks features in those data, mentally labels them as "events," and looks for correlations in time against other timelines. However, the mind, though highly capable, is all too often fallible. It favors connections with short time lags and events of the same shape, size, and duration, paying little attention to the influence of variable scaling on shape. It also tends to associate variables for which it holds a preconceived notion that a relationship exists, while failing to assess the statistical validity of the labeling process. The mind's eye is easily persuaded by superposition and by purposeful, or even spurious, alignment of individual tags' time traces; such visual tricks help it find supporting evidence for a pattern. The real challenge is developing an automated, comprehensive, dispassionate, independent, statistically valid process.
If all these objectives can be met, then many types of oilfield evaluations can be carried out with an event-based analysis.

The Event-Based Approach

A Brief Description. For an asset under study, one must first choose the producer and injector wells to include. These will be wells that are sufficiently well instrumented to provide a time-series data stream expected to contain events discernible from the unavoidable sources of measurement noise. Next, one chooses a time period on which to focus the analysis. Recent data are usually the most instructive about what is likely to happen next, but periods of better-quality data may exist. One must then choose the well attribute that will serve as the variable indicating the occurrence of an event. Typically, the choice will be an allocated (or possibly measured) flow; an aspect of production such as water cut or gas/oil ratio; or a direct intensive measurement such as a pressure or even a temperature. Injector wells will have some, typically corresponding, attribute, such as injection rate or a pressure.

The event-based analysis begins with the marking of events for each producer and injector well; the authors have developed an automated event-marking algorithm for this purpose. Once a complete set of injection and production events has been obtained, the process associates these events with one another. This is performed by considering an appropriate range of time delays that reflect the physical separation of the wells and the intervening reservoir properties. The result is a ranking of the possible connections between injectors and producers, together with their estimated time delays, summarized as an optimal score for each connection.
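The paper does not publish the authors' event-marking or association algorithms, so the following Python sketch is an illustration only, with every function name, threshold, and scoring rule being our own assumptions. It marks events as large jumps in a well's time series (a crude stand-in for the automated event-marking algorithm) and then scores candidate injector-producer time delays by how many injection events are echoed by a later production event.

```python
import numpy as np

def mark_events(series, threshold=5.0):
    # Hypothetical stand-in for the authors' automated event-marking
    # algorithm: flag time steps where the first difference exceeds
    # `threshold` robust standard deviations (MAD-based scale).
    diffs = np.diff(series)
    scale = 1.4826 * np.median(np.abs(diffs - np.median(diffs))) or 1.0
    return [i + 1 for i, d in enumerate(diffs) if abs(d) > threshold * scale]

def associate_events(inj_events, prod_events, max_lag=10, tol=0):
    # Score each candidate time delay by the fraction of injection
    # events matched (within `tol` steps) by a production event that
    # occurs `lag` steps later; return the best delay and its score.
    best_lag, best_score = None, 0.0
    for lag in range(max_lag + 1):
        matched = sum(
            any(abs(p - lag - i) <= tol for p in prod_events)
            for i in inj_events
        )
        score = matched / max(len(inj_events), 1)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag, best_score
```

In practice one would run `associate_events` over every injector-producer pair and rank the pairs by their optimal scores; a field-grade implementation would also need to handle allocation noise, irregular sampling, differing event polarities, and an assessment of statistical significance.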