Abstract

In order to properly manage the depletion of a petroleum reservoir, one must understand its physical characteristics and their relationship to production performance. Often the challenge is to interpret an extensive amount of available information. This paper presents a statistical approach to this common petroleum engineering problem. Statistical analysis can provide an improved understanding of any reservoir. Beginning with the compilation of a comprehensive database, intrinsic relationships among physical parameters and production are characterized. Mapping of statistically processed data allows field-wide interpretation and visual comparison with the geological model. Probabilities of production success can be determined mathematically and used to focus optimization efforts on areas of the reservoir with favourable characteristics. The techniques provide an unbiased appraisal of the available data that can lead to re-examination of assumptions about the underlying mechanisms governing production behaviour. Technical hypotheses can be tested for consistency by determining whether the expected correlations are present in the data. A recent evaluation of the performance of a heavy oil waterflood at Golden Lake, Saskatchewan, is discussed to illustrate the application of the method. The study has resulted in modifications to well completion practices and the reservoir depletion strategy, and favourable areas for infill drilling have been identified.

Introduction

The initial challenge when analysing the performance of a petroleum reservoir is often to marshal and interpret an overwhelming amount of raw data. Technological advances in reservoir characterization and data processing have provided access to an abundant supply of numerical data. At the same time, the importance of understanding reservoir heterogeneity to maximize depletion efficiency is increasingly recognized.
Complex variation of fluid and rock properties within reservoirs is the norm, demanding more detailed description and sophisticated technical analysis. Techniques for processing large databases and distilling critical knowledge are therefore of growing interest to the petroleum engineer. The data-overload problem may be addressed by arbitrarily discarding information or by working with averaged values. A compromise between these shortcuts and utilization of all of the available data is usually necessary; however, the quality of the technical analysis may be sacrificed by oversimplification. A method of extracting the important information concealed in an extensive dataset is required. Determining the significant factors controlling production success will usually suggest the proper course for optimizing the field. Statistical analysis techniques are well suited to achieving these objectives.

Statistical analysis of data is best regarded as a supplement to, not a replacement for, the engineering analysis typically undertaken to solve technical problems. As an integrated component of a study, it can generate ideas, provide direction, and confirm that the data actually support the theoretical conclusions. The value added by statistical analysis comes from ensuring that no important trends hidden in the data are overlooked, and from the improved level of confidence in the results.

Petro-Canada initiated a comprehensive technical review of the Golden Lake heavy oil field in 1994. The scope of the work included standard waterflood analysis, laboratory experiments, field tests, and numerical simulation in conjunction with the statistical analysis discussed in this paper.
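The idea of testing a technical hypothesis by checking whether the expected correlation is present in the data can be sketched simply. The following Python example computes a Pearson correlation coefficient between two per-well quantities; the variable names and values (net pay versus cumulative oil) are illustrative assumptions, not data from the Golden Lake study:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-well data: net pay (m) vs. cumulative oil (10^3 m^3).
net_pay = [4.2, 6.8, 3.1, 8.0, 5.5, 7.2, 2.9, 6.1]
cum_oil = [11.0, 19.5, 8.2, 24.1, 15.3, 21.0, 7.6, 17.8]

r = pearson_r(net_pay, cum_oil)
print(f"r = {r:.3f}")
# A strong positive r is consistent with the hypothesis that thicker
# pay controls production; an r near zero would prompt re-examination
# of the assumed mechanism, as the paper describes.
```

In practice a significance test would accompany the coefficient, since a small well count can produce spurious correlations; the point here is only the consistency check between hypothesis and data.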
