Abstract

Updating reservoir models by history matching 4D seismic data alongside production data gives a better understanding of changes to the reservoir, reduces risk in forecasting and leads to better management decisions. This process of seismic history matching requires an accurate representation of predicted and observed data so that they can be compared quantitatively in automated inversion. Observed seismic data, however, are often obtained as a relative measure of the reservoir state or its change. The data, usually attribute maps, need to be calibrated before they can be compared with predictions. In this paper we describe an alternative approach in which we normalize the observed data by scaling them to the model data in regions where predictions are good. To remove measurements of high uncertainty and make the normalization more effective, we use a measure of repeatability of the monitor surveys to filter the observed time-lapse data.

We apply this approach to the Nelson field. We normalize the 4D signature by deriving a least-squares regression equation between the observed and synthetic data, which consist of attributes representing measured acoustic impedances and predictions from the model. Two regression equations are derived as part of the analysis: one uses the whole 4D signature map of the reservoir, while the second uses 4D seismic data from the vicinity of wells with a good production match. The repeatability of the time-lapse seismic data is assessed using the normalized root mean square (NRMS) of measurements outside the reservoir; where the NRMS is high, observations and predictions are ignored. Net-to-gross and permeability are modified to improve the match.

The best results are obtained using the NRMS-filtered maps of the 4D signature, which better constrain the normalization. The misfit of the first six years of history data is reduced by 55%, while the misfit of the forecast over the following three years is reduced by 29%. The well-based normalization uses fewer data when repeatability is used as a filter, and the result is poorer. The value of the seismic data is demonstrated by matching production data alone: the history and forecast misfit reductions are then 45% and 20% respectively, but the seismic misfit increases by 5%, whereas in the best case using seismic data it drops by 6%. We conclude that normalization with repeatability-based filtering is a useful approach in the absence of full calibration and improves the reliability of seismic data.
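As a rough illustration of the repeatability filter described above (this is a sketch, not code from the paper), the NRMS measure of Kragh and Christie can be computed per bin over a window of samples outside the reservoir, where no production-related 4D signal is expected, and used to mask unreliable parts of the observed attribute map. The array shapes, the window extraction and the 30% threshold below are assumptions made for the example only.

```python
import numpy as np

def nrms_map(base_win, mon_win):
    """Per-bin NRMS (in %) between base and monitor surveys.

    base_win, mon_win : arrays of shape (nx, ny, nt), amplitudes from a
    time window outside the reservoir, so that any difference reflects
    non-repeatability rather than production effects.

    NRMS = 200 * RMS(monitor - base) / (RMS(monitor) + RMS(base))
    """
    rms = lambda x: np.sqrt(np.mean(x ** 2, axis=-1))
    return 200.0 * rms(mon_win - base_win) / (rms(mon_win) + rms(base_win))

def filter_by_repeatability(obs_4d, nrms, threshold=30.0):
    """Mask bins of the observed 4D signature map where NRMS exceeds a
    threshold (the 30% value is illustrative, not taken from the paper).
    Masked bins are excluded from both normalization and the misfit."""
    filtered = obs_4d.copy()
    filtered[nrms > threshold] = np.nan
    return filtered
```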

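The normalization step itself can likewise be sketched as a least-squares regression between the observed and synthetic attribute maps, fitted only on trusted bins and then inverted to bring the observed data onto the model scale. The function and mask names here are hypothetical; the paper's map-based variant would take `good_mask` over the whole reservoir, while the well-based variant would restrict it to bins near wells with a good production match.

```python
import numpy as np

def normalize_4d(obs, syn, good_mask):
    """Scale the observed 4D signature to the synthetic (model) attribute.

    Fits obs ~ a * syn + b by least squares over bins where the model is
    trusted (good_mask) and where observations survived the repeatability
    filter (finite values), then inverts the relation to rescale the
    whole observed map into model units.
    """
    valid = good_mask & np.isfinite(obs) & np.isfinite(syn)
    a, b = np.polyfit(syn[valid], obs[valid], deg=1)  # obs = a*syn + b
    return (obs - b) / a
```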