Abstract

This paper describes the full cycle of 4D seismic data integration, comprising workflows for 4D data analysis, quality control of reservoir models, and reservoir model updating using both 4D seismic and well production data. These workflows are applied to a deepwater field where high-quality 4D seismic data are available. In the first step, we analyze the 4D seismic data and extract multiple attributes to image changes in reservoir properties. Next, we apply different workflows that link the 4D seismic data with the reservoir model. Finally, we update the reservoir model automatically by simultaneously honoring the 4D seismic and well production data. We use a novel approach that incorporates 4D seismic amplitude differences into a joint history-matching workflow without explicitly modeling the full physics.

Introduction

Reservoir monitoring using 4D seismic data is becoming an increasingly important tool for reservoir management (Calvert, 2005). Nevertheless, the quantitative integration of both 4D seismic and historical production data into reservoir simulation models is a challenging task that has recently become an active direction of research. Huang et al. (1997) applied a stochastic optimization method to minimize the mismatch between synthetic and observed seismic data over a reservoir, achieving simultaneous history matching of 4D seismic and well-by-well production data. Landa (1997) proposed a gradient-based method to integrate both 4D seismic and pressure-transient data. Stephen et al. (2006) developed a workflow for multiple-model history matching through simultaneous comparison of spatial information extracted from 4D seismic data as well as individual well-production data; employing the Neighbourhood Algorithm (NA) as the sampling engine, this workflow was applied to the North Sea Schiehallion field. Skjervheim et al. (2007) presented a version of the Ensemble Kalman Filter (EnKF) for continuous model updating capable of matching a combination of production and 4D seismic data; they tested the method on a synthetic case and a North Sea field case. Jin et al. (2007, 2008) proposed combining the Very Fast Simulated Annealing (VFSA) method with pilot-point parameterization to solve the 4D seismic history-matching inverse problem and applied the workflow to a synthetic case. Castro (2006) proposed a probabilistic approach to perturb a high-resolution 3D geocellular model in order to integrate data from diverse sources, such as well logs, geological information, 3D/4D seismic, and production data; this workflow was successfully applied to a reservoir of the Oseberg field. Jin et al. (2011) also proposed a flood-front-based 4D seismic history-matching workflow. In this paper, we present a case study of the full cycle of 4D seismic data integration, ranging from basic, qualitative 4D attribute analysis to an advanced 4D seismic history-matching workflow.

4D seismic attributes analysis

This workflow provides an analysis of 4D seismic differences related to changes in reservoir properties. First, timeshifts between the baseline and monitor 3D seismic volumes are computed through cross-correlation. Some initial data preparation usually takes place before the cross-correlation of the datasets, including automatic gain control and trace stacking for signal-to-noise ratio enhancement. Next, the computed timeshift is removed from the monitor survey in order to obtain meaningful 4D difference attributes. At that point, 4D seismic attributes can be extracted from a given gate around time horizons of interest, usually reservoir tops. For reservoirs with large lateral thickness changes, both top and base reservoir horizons should be used for attribute calculation. Different types of attributes can be extracted, such as the root mean square (RMS) difference and the normalized RMS difference (NRMSD).
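The timeshift estimation and NRMSD steps described above can be sketched for a single pair of traces as follows. This is a minimal illustration in NumPy, not the field workflow's actual implementation: the function names are our own, the lag search is restricted to integer samples (production codes typically estimate subsample shifts over windowed gates), and the NRMSD follows the commonly used definition 200·rms(monitor − base)/(rms(monitor) + rms(base)).

```python
import numpy as np

def estimate_timeshift(base, monitor, max_lag):
    """Return the integer lag (in samples) that best aligns a monitor
    trace with its baseline, found by scanning the cross-correlation.
    A positive lag means the monitor event arrives later than baseline."""
    n = len(base)
    best_lag, best_score = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            score = np.dot(monitor[lag:], base[:n - lag])
        else:
            score = np.dot(monitor[:n + lag], base[-lag:])
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def remove_timeshift(monitor, lag):
    """Shift the monitor trace back by `lag` samples (zero-padded at the
    edges) so that 4D differences reflect amplitude changes, not timing."""
    n = len(monitor)
    aligned = np.zeros_like(monitor)
    if lag >= 0:
        aligned[:n - lag] = monitor[lag:]
    else:
        aligned[-lag:] = monitor[:n + lag]
    return aligned

def nrmsd(base, monitor):
    """Normalized RMS difference in percent:
    200 * rms(monitor - base) / (rms(monitor) + rms(base))."""
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    return 200.0 * rms(monitor - base) / (rms(monitor) + rms(base))

# Synthetic single-trace example: the monitor event is delayed 3 samples.
t = np.arange(200)
base = np.exp(-0.5 * ((t - 50) / 4.0) ** 2)
monitor = np.exp(-0.5 * ((t - 53) / 4.0) ** 2)
lag = estimate_timeshift(base, monitor, max_lag=10)   # recovers 3
aligned = remove_timeshift(monitor, lag)
# NRMSD drops sharply once the timeshift is removed, so the remaining
# difference can be read as a genuine 4D amplitude change.
```

In a full workflow these per-trace operations would be applied volume-wide and the attributes extracted only within a gate around the interpreted reservoir horizons.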
