Articles published on History matching
2403 Search results
- Research Article
- 10.1007/s10596-026-10413-w
- Mar 3, 2026
- Computational Geosciences
- Paulo Henrique Ranazzi + 2 more
Abstract Ensemble-based methods have become the state-of-the-art approaches to reservoir data assimilation (RDA). In practical applications, however, they suffer from issues imposed by the limited ensemble size. One noticeable problem is the significant sampling error in the sample covariance estimator when the ensemble size is much smaller than the dimensionality of the RDA problem. Enhancing the estimation accuracy of sample covariance matrices is therefore crucial for improving the performance of ensemble-based data assimilation in practice. In this article, we propose a novel approach, called the covariance scaling method, for mitigating sampling errors in sample covariance matrices. This approach aims to find the optimal regularization parameter that minimizes the difference between a true covariance and its sample estimate. In contrast to other similar methods in the literature, such as the covariance shrinkage method, covariance scaling can be applied to minimize the errors in approximating a generic covariance matrix, including the cross-covariance matrix, which is of particular interest to ensemble-based methods. In addition, since the optimal regularization parameter of covariance scaling depends on the true, yet unknown, covariance matrix, we propose an approximate formula to calculate the regularization parameter from sample covariance and cross-covariance matrices, and we further extend this approximate formula to derive an alternative, tuning-free method for adaptive localization. The covariance scaling method was evaluated and compared with similar techniques in several experiments, showing improved performance in terms of both cross-covariance estimation and ensemble data assimilation.
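The scaling idea in this abstract can be illustrated with a minimal toy sketch (my own construction, not the paper's estimator): for a known target covariance C, the scalar alpha minimizing the Frobenius error ||alpha*S - C||_F has the closed form alpha* = <S, C>_F / <S, S>_F, and applying it can never increase the error of the sample estimate S.

```python
import numpy as np

# Toy illustration of covariance scaling (assumes the "true" covariance C is
# known, which it is not in practice; the paper derives approximate formulas).
rng = np.random.default_rng(0)
d, n = 50, 10                              # dimension much larger than ensemble size
C = np.diag(np.linspace(1.0, 5.0, d))      # assumed "true" covariance
X = rng.multivariate_normal(np.zeros(d), C, size=n).T
S = np.cov(X)                              # noisy, rank-deficient sample covariance

# Optimal scalar for min_alpha ||alpha*S - C||_F^2 (Frobenius inner products).
alpha = np.sum(S * C) / np.sum(S * S)

err_raw = np.linalg.norm(S - C, "fro")
err_scaled = np.linalg.norm(alpha * S - C, "fro")
assert err_scaled <= err_raw               # alpha=1 is feasible, so scaling can't hurt
```

Because the true C is unknown in a real assimilation run, the paper's contribution is precisely an approximate formula for this regularization parameter built from sample covariance and cross-covariance matrices.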
- Research Article
- 10.3390/app16052436
- Mar 3, 2026
- Applied Sciences
- Adewale Amosu + 7 more
The Panoma Field in the Hugoton Embayment, Kansas, has produced significant gas resources from thousands of wells perforating the Permian Chase and Council Grove Groups. Variability in gas production from these formations is controlled by facies-influenced petrophysical properties. The use of geological facies data in numerical modeling is often limited to delineating regions of interest, without intrinsic use in estimating petrophysical properties. Machine learning provides opportunities to integrate facies data into the numerical model-building process. In this study, we employ facies data to optimize a numerical-model permeability-matrix scaling parameter using Monte Carlo simulation of Markov switching dynamic regression and machine learning. Realizations of the scaling parameter are included in a machine learning facies-prediction workflow to identify the parameter value that maximizes facies prediction accuracy, with test accuracy as high as 83%. A 3D numerical model was constructed to represent the interlayered carbonate, shale, and non-marine sandstone facies typical of the Council Grove intervals. Multiple field development and completion scenarios were evaluated to maximize cumulative gas recovery and to assess the role of facies distribution in reservoir performance. History matching against historical gas production demonstrates strong coupling between the facies distribution and the optimized permeability, emphasizing the importance of facies data integration in reservoir property modeling and gas production estimation in Permian reservoirs. This implies that probabilistically constrained permeability scaling using the Monte Carlo and machine learning workflow produces more realistic models than traditional approaches.
- Research Article
- 10.1038/s41598-026-40306-y
- Mar 2, 2026
- Scientific reports
- Wei Xiong + 4 more
The volume stimulation technique is currently the primary engineering approach for effectively enhancing shale gas productivity. In post-fracturing reservoirs, however, multi-media coupling is prominent and fracture conductivity exhibits strong heterogeneity, which necessitates further characterization of the heterogeneous spatial distribution of complex fractures as well as of the multiple flow mechanisms. This study develops a multiscale coupled matrix-fracture porous-media flow model based on the post-fracturing reservoir geometry to precisely describe production dynamics in fractured shale gas reservoirs. Heterogeneity in hydraulic fracture geometry and stimulated reservoir volume (SRV) permeability was represented using continuous gradient functions. Analytical solutions for the multi-stage fractured shale gas reservoir model were derived through a multi-linear flow model and perturbation methods. By integrating simulated annealing and particle swarm optimization algorithms, the C++ code was enhanced to achieve automated history matching and production forecasting. Results indicate that hydraulic fracture damage predominantly affects early-stage production, with near-tip fracture blocking leading to reduced overall gas production. Mid-to-late production is more strongly influenced by SRV permeability: gradual permeability variations have minor impacts, whereas abrupt permeability changes substantially decrease total gas production. Based on an analysis of dynamic production data from two wells in the oilfield, this study achieves optimal matching of unknown reservoir parameters, thereby enabling reliable production performance prediction. The proposed model demonstrates high rationality and practicality in reservoir parameter inversion and shale gas well performance forecasting, providing novel insights into critical challenges in shale gas production.
- Research Article
- 10.1063/5.0311306
- Mar 1, 2026
- Physics of Fluids
- Tianrui Ye + 2 more
Surrogate reservoir models have emerged as efficient alternatives that approximate full-physics reservoir simulation while reducing computational costs. Although existing studies have used synthetic models to test the feasibility of surrogate models, their application to actual reservoirs remains limited. To employ a surrogate model in real-life history matching and field development optimization, key geological properties and frequent development operations must be considered jointly. This study develops a full-scale surrogate reservoir model based on the Koopman neural operator (KNO) for actual three-dimensional (3D) oil reservoirs under waterflood operations. The model integrates static reservoir properties (permeability, net-to-gross ratio, and relative permeability) and dynamic development parameters (well placement and controls) as inputs. A scientific sampling method that considers geological principles and monthly production operations ensures feasible and diverse training samples. The proposed 3D KNO architecture incorporates a learned grid layer to handle corner-point grids and leverages Fourier transforms to linearize nonlinear dynamics in a high-dimensional space. After validation on a real oil field in China, the method demonstrates strong capability in predicting pressure and saturation changes and oil production rates. In a comparison of prediction performance against a baseline physics-informed neural network, the KNO model greatly outperforms the convolutional-neural-network-based discrete mapping model. The KNO model's predictions align well with numerical simulations, offering a robust and efficient tool for history matching and field development optimization.
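The KNO architecture itself (learned grid layer, Fourier lifting) is beyond a short sketch, but the underlying Koopman idea of advancing the system with a learned linear operator can be illustrated with a plain dynamic-mode-decomposition-style least-squares fit. Everything below (the toy dynamics matrix A, the trajectory length) is invented for illustration and is not from the paper.

```python
import numpy as np

# Toy Koopman-style surrogate: collect snapshot pairs (x_t, x_{t+1}) and fit a
# linear operator K with X2 ~= K @ X1 by least squares (ordinary DMD). For
# genuinely linear toy dynamics, K recovers A essentially exactly.
rng = np.random.default_rng(1)
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])                 # hypothetical stable linear dynamics
x = rng.normal(size=2)
snapshots = [x.copy()]
for _ in range(20):                        # roll the "true" system forward
    x = A @ x
    snapshots.append(x.copy())
X = np.column_stack(snapshots)             # states as columns
X1, X2 = X[:, :-1], X[:, 1:]

K = X2 @ np.linalg.pinv(X1)                # least-squares Koopman approximation
assert np.allclose(K, A, atol=1e-8)        # exact recovery in the linear toy case
```

The KNO replaces this raw least-squares fit with a neural lifting into a high-dimensional space where the (nonlinear) reservoir dynamics become approximately linear, which is what makes the approach viable for pressure and saturation fields.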
- Research Article
- 10.1007/s10596-026-10411-y
- Feb 26, 2026
- Computational Geosciences
- Frode Lomeland + 3 more
Abstract History matching is an essential tool for providing reliable and accurate reservoir models that are useful for prediction and optimization, and it is routinely used in the petroleum industry. By updating the model based on available dynamic data, modeling error and uncertainty are reduced systematically. Traditional history matching updates static parameters such as porosity and permeability. This work focuses on relative permeability and shows how it can be included in the history matching workflow systematically and efficiently to further improve the predictive capabilities of the updated reservoir model. The novelty of our approach is associating unique relative permeability curves, based on the LET formulation, with a flow region for each well. These flow regions are computed by solving tracer transport equations on the flow field and thus represent the streamlines connecting the injectors and the producers. The data used for the history matching are the production data in the wells; flow regions are therefore a natural choice for localizing the update to avoid over-fitting and ensemble collapse. The workflow is demonstrated on the Drogon reservoir model and shows an excellent match of the data in both the training and validation periods. The Drogon model is a synthetic reservoir model made openly available by Equinor for realistic testing of history matching and optimization workflows.
- Research Article
- 10.3390/app16052278
- Feb 26, 2026
- Applied Sciences
- Sadam Hussain + 3 more
This study develops a conceptual framework for characterizing reservoir architecture in multi-component, discrete systems using pressure transient analysis (PTA), aimed at calibrating inflow geometry prior to full-field dynamic simulation for subsurface gas storage applications such as CO2 and hydrogen. A secondary objective is to identify variations in permeability over time by analyzing flow capacity trends and evaluating the dynamic influence of faults and fractures. The analysis is based on a gas-condensate field comprising seven wells and four zones (A, B, C, D), using integrated dynamic datasets including extended well tests (EWTs), mud loss, production logs, and production data. Detailed interpretation of PX-1’s EWT indicated delayed re-pressurization and persistent under-pressure, suggesting a compartmentalized or transient system with limited gas-in-place connectivity. Four reservoir architecture concepts were developed: (1) lithology-dominated inflow, (2) structurally controlled inflow, (3) discrete, weakly connected compartments, and (4) transient-dominated systems with tight matrix GIIP. These concepts informed four reservoir models: matrix-only (M), areal heterogeneity (A), sparse bodies (B), and sparse networks (S). Application of these models across other wells revealed consistent localized KH (permeability–thickness product) behavior, with all models fitting short-duration data comparably. However, only sparse drainage models (B/S) adequately matched PX-1’s EWT response. PTA results confirm that well tests constrain KH locally but provide limited insight into large-scale reservoir architecture. EWTs may reach ~1 km, while shorter tests are confined to ~200–400 m, typically within one to two simulation grid blocks. This study demonstrates how integrating PTA with multi-scale data improves characterization of naturally fractured, tight carbonate reservoirs and supports reservoir simulation and history matching for hydrogen storage evaluation. 
Based on reservoir simulations, this study concluded that naturally fractured carbonate gas reservoirs can provide significant storage and injection capacities for underground hydrogen storage. It exemplifies how to characterize naturally fractured tight carbonate reservoirs by integrating multi-scale and multi-dimensional data such as PTA. Furthermore, it informs gridding for full-field reservoir models used in history matching and in quantifying the hydrogen storage potential of these complex reservoirs. The proposed workflow provides an uncertainty-bounded reservoir characterization framework and should not be interpreted as a complete field-design methodology for hydrogen storage. The modeling does not explicitly couple geomechanical fracture growth, hydrogen diffusion, long-term geochemical reactions, or caprock integrity degradation. Therefore, the presented storage scenarios represent technically feasible cases under defined assumptions. Comprehensive site-specific geomechanical and containment assessments are required prior to field-scale implementation.
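The quoted depths of investigation (roughly 200-400 m for short tests versus about 1 km for EWTs) follow the standard pressure-transient scaling in which the investigated radius grows with the square root of test duration. A hedged sketch using the classic oilfield-units rule of thumb, which is not a formula from this paper, with property values invented for illustration:

```python
import math

# Classic radius-of-investigation estimate for an infinite-acting reservoir,
# in oilfield units: k in md, t in hours, porosity as a fraction, mu in cp,
# ct in 1/psi, result in ft. A rule of thumb, not the study's own model.
def radius_of_investigation_ft(k_md, t_hr, phi, mu_cp, ct_per_psi):
    return 0.029 * math.sqrt(k_md * t_hr / (phi * mu_cp * ct_per_psi))

# Illustrative (assumed) tight gas-condensate properties.
r_short = radius_of_investigation_ft(0.5, 72.0, 0.08, 0.02, 1e-4)    # 3-day test
r_ewt = radius_of_investigation_ft(0.5, 2160.0, 0.08, 0.02, 1e-4)    # 90-day EWT
assert r_ewt > r_short    # longer tests probe deeper, as the abstract notes
```

Because the radius scales with sqrt(t), extending a test by a factor of 30 only deepens the investigated region by a factor of about 5.5, which is why even EWTs constrain KH locally rather than revealing large-scale architecture.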
- Research Article
- 10.3390/nano16040264
- Feb 17, 2026
- Nanomaterials (Basel, Switzerland)
- Ruihan Zhang + 4 more
Hydraulic fracturing is a critical technology for developing shale gas reservoirs, which are typical natural nanoporous media. However, the complex two-phase flow induced by fracturing fluid retention and the strong interference among hydraulic fractures introduce significant uncertainties to productivity forecasting. To address these challenges, this study proposes a transient productivity forecasting method to characterize fluid transport in fractured nanoporous media. This method introduces a gas-water two-phase pseudo-pressure function to reconstruct the flow equations, utilizing micro-segment discretization and the principle of superposition to accurately characterize pressure drop interference among fractures, enabling rapid dynamic productivity forecasting under realistic well trajectory conditions. The investigation reveals that while increasing fracture count, half-length, and permeability enhances productivity, these improvements exhibit significant diminishing marginal returns, indicating the existence of optimal economic thresholds for these engineering parameters. Conversely, elevated water saturation, skin factor, and stress sensitivity lead to a decline in productivity. Analysis of flow interference demonstrates that fractures at the wellbore extremities contribute significantly higher production than those in the central section due to reduced interference, while deviations in the wellbore trajectory further exacerbate production heterogeneity. Field application confirms that the proposed method achieves reliable production history matching under realistic well trajectories and accurately captures the typical three-stage production characteristics of shale gas wells, providing a robust basis for Estimated Ultimate Recovery (EUR) assessment and fracturing design optimization.
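The pseudo-pressure idea referenced above can be sketched in its simplest single-phase form, m(p) = 2 * integral of p'/(mu*Z) dp', evaluated numerically. The paper's gas-water two-phase pseudo-pressure additionally folds in relative permeabilities, and the property tables below are invented purely for illustration.

```python
import numpy as np

# Single-phase real-gas pseudo-pressure via trapezoid-rule integration.
# The paper's two-phase version is richer; this shows only the basic transform.
def pseudo_pressure(p_grid, mu, z, p):
    """Integrate 2*p'/(mu*Z) from p_grid[0] up to pressure p."""
    mask = p_grid <= p
    pg = p_grid[mask]
    f = 2.0 * pg / (mu[mask] * z[mask])
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(pg)))

# Hypothetical property tables (assumed, not field data): psia, cp, Z-factor.
p_grid = np.linspace(100.0, 5000.0, 200)
mu = 0.012 + 2e-6 * p_grid                      # mild pressure dependence
z = 0.95 - 1e-5 * p_grid + 2e-9 * p_grid**2     # gas deviation factor

m_lo = pseudo_pressure(p_grid, mu, z, 2000.0)
m_hi = pseudo_pressure(p_grid, mu, z, 4000.0)
assert m_hi > m_lo > 0.0    # m(p) increases monotonically with pressure
```

The practical payoff of the transform is that the nonlinear gas flow equation becomes approximately linear in m(p), which is what allows the superposition-based treatment of inter-fracture interference described in the abstract.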
- Research Article
- 10.1021/acsomega.5c02722
- Feb 4, 2026
- ACS omega
- Yanlai Li + 4 more
The Archean metamorphic buried-hill oil reservoirs are characterized by natural fractures at different scales, which makes them an important reservoir type in Liaodong Bay, China. The heterogeneity of fractures between vertical members can lead to significant differences in multiphase flow. In this study, to investigate the differing characteristics of relative permeability and capillary pressure curves during imbibition, spontaneous imbibition experiments were conducted under reservoir conditions using three representative core plugs collected from the upper and lower members of the semiweathered crust and the inner basement zones of the BH reservoir. Regression of the cumulative oil production was then conducted to obtain the capillary pressure and relative permeability curves based on an imbibition mathematical model. Experimental results show that the oil production rate depends on the density of microfractures. The oil production rate of sample JZS2 from the lower member is higher than that of sample JZS1 from the upper member, as the former developed more microfractures, providing more channels for oil migration. Accordingly, the irreducible water saturation and residual oil saturation from the relative permeability curves are smaller for sample JZS2. For sample JZS3, with almost no microfractures developed, the oil production rate is much lower because oil inside the matrix must migrate longer distances; hence, its two-phase region is the smallest among the three samples. Based on the regressed relative permeability and capillary pressure curves, numerical simulations of the BH reservoir were performed. History matching results show that water cut can be matched well by employing the three sets of relative permeability and capillary pressure curves instead of one. Three stages were classified based on the oil production contribution ratios from the fracture system.
In Stage I, produced oil came mainly from the fracture system; this contribution gradually decreased in Stage II as less oil remained in the fractures and water coning subsequently occurred. In Stage III, oil was produced mainly from the matrix system. Overall, this study illustrates the effect of fracture heterogeneity on the imbibition process during water flooding and provides insights into designing water injection strategies for displacing remaining oil in fractured reservoirs.
- Research Article
- 10.1111/1752-1688.70086
- Feb 1, 2026
- JAWRA Journal of the American Water Resources Association
- Michael N Fienen + 4 more
ABSTRACT History matching of large hydrologic models is challenging due to data sparsity and non‐unique process combinations (and associated parameters) that can produce similar model predictions. We develop an ensemble‐based history matching (and uncertainty quantification) approach using an iterative ensemble smoother (iES) method for three cutouts of the National Hydrologic Model (NHM) and qualitatively compare the results and performance to the stepwise history matching approach. In the latter approach, subsets of parameters and observations were sequentially calibrated to a diverse range of observations to mitigate non‐uniqueness and local minima. In iES, localization simulates the same causal connections between parameters and observations without the need (and computational cost) of sequential history matching steps. iES uses a weighted sum‐of‐squared‐errors objective function, which allows differential weighting of multiple data sources. The formal adoption of range observations also pushes results to within ranges of observed values rather than toward discrete values. Overall, the ensemble approach performs similarly to the stepwise approach. Both approaches performed poorly for the cutout representing a snowmelt‐dominated watershed, indicating a structural issue in the process representation of the model. The main advantage of iES is quantification of uncertainty in both the history matching and the predictions of interest.
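The two objective-function ingredients named in this abstract, differential weighting and range observations, can be sketched as follows. The function names, weights, and data values are illustrative assumptions, not the iES implementation itself.

```python
import numpy as np

# Weighted sum-of-squared-errors: each residual is scaled by a weight, letting
# different data sources (flows, snowpack, ET, ...) contribute proportionally.
def weighted_sse(sim, obs, w):
    r = np.asarray(sim) - np.asarray(obs)
    return float(np.sum(np.asarray(w) * r**2))

# Range observation: zero misfit anywhere inside [lo, hi]; the penalty grows
# only with the distance outside the acceptable range.
def range_misfit(sim, lo, hi, w):
    s = np.asarray(sim)
    r = np.where(s < lo, lo - s, np.where(s > hi, s - hi, 0.0))
    return float(np.sum(np.asarray(w) * r**2))

# Toy composite objective: two point observations plus two range observations.
phi = (weighted_sse([10.2, 9.8], [10.0, 10.0], [1.0, 1.0])
       + range_misfit([4.0, 7.5], 5.0, 8.0, [0.5, 0.5]))
assert range_misfit([6.0], 5.0, 8.0, [1.0]) == 0.0   # inside the range: no penalty
```

Because in-range values incur no penalty, the smoother is pulled toward plausible intervals rather than toward single discrete target values, which matches the behavior the abstract describes.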
- Research Article
- 10.3390/polym18030359
- Jan 29, 2026
- Polymers
- Mohammed A Khamis + 3 more
Declining recovery factors from mature oil fields, coupled with the technical challenges of recovering residual oil under harsh reservoir conditions, necessitate the development of advanced enhanced oil recovery (EOR) techniques. While promising, chemical EOR often faces economic and technical hurdles in high-salinity, high-temperature environments where conventional polymers like hydrolyzed polyacrylamide (HPAM) degrade and fail. This study presents a comprehensive numerical investigation that addresses this critical industry challenge by applying a rigorously calibrated simulation framework to evaluate a novel hybrid EOR process that synergistically combines an ionic liquid (IL) with HPAM polymer. Utilizing core-flooding data from a prior study that employed the same Berea sandstone core plug and Saudi medium crude oil, supplemented by independently measured interfacial tension and contact angle data for the same chemical system, we built a core-scale model that was history-matched with RMSE < 2% OOIP. The calibrated polymer transport parameters—including a low adsorption capacity (~0.012 kg/kg-rock) and a high viscosity multiplier (4.5–5.0 at the injected concentration)—confirm favorable polymer propagation and effective in-situ mobility control. Using this validated model, we performed a systematic optimization of key process parameters, including IL slug size, HPAM concentration, salinity, temperature, and injection rate. Simulation results identify an optimal design: a 0.4 pore volume (PV) slug of IL (Ammoeng 102) reduces interfacial tension and shifts wettability toward water-wet, effectively mobilizing residual oil. This is followed by a tailored HPAM buffer in diluted formation brine (20% salinity, 500 ppm), which enhances recovery by up to 15% of the original oil in place (OOIP) over IL flooding alone by improving mobility control and enabling in-depth sweep.
This excellent history match confirms the dual-displacement mechanism: microscopic oil mobilization by the IL, followed by macroscopic conformance improvement via HPAM-induced flow diversion. This integrated simulation-based approach not only validates the technical viability of the hybrid IL–HPAM flood but also delivers a predictive, field-scale-ready framework for heterogeneous reservoir systems. The work provides a robust strategy to unlock residual oil in such challenging reservoirs.
- Research Article
- 10.1007/s11242-025-02275-0
- Jan 7, 2026
- Transport in Porous Media
- David Landa-Marbán + 4 more
Abstract We present a history matching (HM) workflow applied to the International FluidFlower benchmark study dataset, which features high-resolution images of CO2 storage in a meter-scale, geologically complex reservoir. The dataset provides dense spatial and temporal observations of fluid displacement, offering a rare opportunity to validate and enhance HM techniques for geological carbon storage (GCS). The combination of detailed experimental data and direct visual observation of flow behavior at this scale is novel and valuable. This study explores the potential and limitations of using experimental data to calibrate standard models for GCS simulation. By leveraging high-resolution images and the resulting interpretations of fluid phase distributions, we adjust uncertain parameters and reduce the mismatch between simulation results and observed data. Simulations are performed using the open-source OPM Flow simulator, while the open-source Everest decision-making tool is employed to conduct the HM. After the HM process, the final simulation results show good agreement with the experimental CO2 storage data. This suggests that the system can be effectively described using standard flow equations, conventional saturation functions, and typical PVT properties for CO2–brine mixtures. Our results demonstrate that the Wasserstein distance is a particularly effective metric for matching multi-phase, multi-component flow data. The entire workflow is implemented in a Python package (Python OPM Flow FluidFlower), which organizes all functionality through a single input file. This design ensures reproducibility and facilitates future extensions of the study.
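For intuition on the mismatch metric this abstract highlights: in one dimension, the Wasserstein-1 distance between two equal-size empirical samples reduces to the mean absolute difference of their sorted values. A minimal sketch under that simplifying assumption (the study applies the metric to far richer spatial saturation data, and the sample values below are made up):

```python
import numpy as np

# 1-D Wasserstein-1 distance for two equal-size empirical samples with uniform
# weights: sort both and average the absolute differences. This is the simplest
# special case of the metric used in the HM workflow described above.
def wasserstein_1d(u, v):
    u = np.sort(np.asarray(u, dtype=float))
    v = np.sort(np.asarray(v, dtype=float))
    assert u.shape == v.shape, "equal-size samples assumed in this sketch"
    return float(np.mean(np.abs(u - v)))

obs = [0.1, 0.4, 0.7, 0.9]   # e.g. observed saturation values (invented)
sim = [0.2, 0.3, 0.8, 1.0]   # e.g. simulated counterparts (invented)
d = wasserstein_1d(obs, sim)
assert d == wasserstein_1d(sim, obs)     # the distance is symmetric
assert wasserstein_1d(obs, obs) == 0.0   # identical samples: zero mismatch
```

Unlike a pointwise squared error, this distance stays informative when a simulated plume is shifted relative to the observed one, which is a plausible reason it works well for matching displacement-front data.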
- Research Article
- 10.1038/s41598-025-98331-2
- Jan 7, 2026
- Scientific reports
- Lingdong Meng + 7 more
During the injection and withdrawal of natural gas, faults may cause lateral leakage, resulting in the loss or migration of natural gas out of the gas storage area. Therefore, the lateral sealing property of faults is crucial for the safe operation of gas storage facilities. This paper uses the L gas storage reservoir as a case study to conduct a statistical analysis of the fault dip, reservoir thickness, and overburden thickness within the faulted structure. It integrates a fault sealing triangle diagram derived from logging data to ascertain that the gas storage facility in this study primarily depends on lithological contact sealing and fault rock sealing mechanisms. Furthermore, it evaluates the sealing capacity of these confining faults and develops a quantitative model for assessing their lateral sealing capacity based on an anatomical examination of the original gas reservoir. Through the analysis of dynamic development data for the safe operation of gas storage facilities, pressure variations on both sides of the faults during different injection and production phases were systematically identified. The pressure differential at the end of production was selected, and a numerical simulation incorporating time effects was conducted to assess the dynamic sealing capacity of the fault. A model representing fault sealing capability based on this dynamic development data was established, which elucidates the sealing mechanisms present on either side of the fault across various periods and identifies factors (fluid pressure, tectonic stress, changes in fluid properties) contributing to pressure differentials. The model demonstrates 89% prediction accuracy through machine learning-assisted history matching of 12 injection-production cycles, significantly outperforming conventional methods by 32%. 
Additionally, the study discusses discrepancies in lateral sealing capacities among different stages and clarifies the fundamental reasons behind variations in pressure differences over time. These findings provide a robust theoretical foundation for assessing sealing capabilities in gas storage facilities during subsequent development phases.
- Research Article
- 10.3329/cerb.v24i1.86730
- Jan 6, 2026
- Chemical Engineering Research Bulletin
- Mohammad Asadullah + 2 more
This paper presents the results of a simulation study of different development scenarios for the Fenchuganj gas field of Bangladesh. The field came into production in 2004 with only one well, which watered out after three years. The well was then recompleted in a lower zone, but the water cut soon became too high, forcing a significant reduction in gas rate for sand-free production. A second well has been in production since 2005, and two more development wells were also under way. The need for a simulation study was obvious at this point, to provide insight into the production behavior, the state of depletion, and the possible effects of the development wells. The first simulation study was carried out in 2009; a second study, carried out later, is the subject of this paper. For the second study, the geological model was revised and validated by history matching. It reproduced the wellhead pressure and water production history of 7 years with reasonable accuracy. With the reliability of the model thus established, predictive simulation was run for 25 years, up to 2036. Five different development scenarios were simulated, incorporating the existing wells as well as new wells. The results indicated a highest recovery of about 81.75%, with six wells draining the three major sands. Chemical Engineering Research Bulletin: 24 (Issue 1): 70-78
- Research Article
- 10.3390/en19010270
- Jan 4, 2026
- Energies
- Jianxun Liang + 4 more
Characterizing and simulating complex reservoirs, particularly unconventional resources with multiscale and non-homogeneous features, presents significant bottlenecks in cost, efficiency, and accuracy for conventional research methods. Consequently, there is an urgent need for the digital and intelligent transformation of the field. To address this challenge, this paper proposes that the core solution lies in the deep integration of physical mechanisms and data intelligence. We systematically review and define a new research paradigm characterized by the trinity of digital cores (geometric foundation), physical simulation (mechanism constraints), and artificial intelligence (efficient reasoning). This review clarifies the core technological path: first, AI technologies such as generative adversarial networks and super-resolution empower digital cores to achieve high-fidelity, multiscale geometric characterization; second, cross-scale physical simulations (e.g., molecular dynamics and the lattice Boltzmann method) provide indispensable constraints and high-fidelity training data. Building on this, the methodology evolves from surrogate models to physics-informed neural networks, and ultimately to neural operators that learn the solution operator. The analysis demonstrates that integrating these techniques into an automated “generation–simulation–inversion” closed-loop system effectively overcomes the limitations of isolated data and the lack of physical interpretability. This closed-loop workflow offers innovative solutions to complex engineering problems such as parameter inversion and history matching. In conclusion, this integration paradigm serves not only as a cornerstone for constructing reservoir digital twins and realizing real-time decision-making but also provides robust technical support for emerging energy industries, including carbon capture, utilization, and sequestration (CCUS), geothermal energy, and underground hydrogen storage.
- Research Article
- 10.37745/bjesr.2013/vol14n14472
- Jan 1, 2026
- British Journal of Earth Sciences Research
- Edidiong Edidiong-Umoh
Mature deltaic and turbidite reservoirs represent critical hydrocarbon assets globally, yet their management is persistently challenged by declining production efficiency, rising subsurface uncertainty, and static geological models that progressively diverge from observed dynamic behavior. Conventional reservoir characterization workflows, including those structured within the Reservoir Management Maturity Model (RM3) framework, typically treat architectural element definitions—the fundamental building blocks of geocellular models—as fixed inputs established during field appraisal and preserved through subsequent model updates. This practice results in "frozen" geological frameworks that lose epistemic flexibility, leading to systematic static-dynamic mismatch, uncontrolled volumetric uncertainty, and suboptimal infill well placement decisions. The chronic failure to operationalize architectural reinterpretation as an active decision variable represents a critical gap in mature field value optimization methodology. This study presents and validates a novel, structured workflow that operationalizes reservoir architectural element re-definition as a formal decision-control mechanism within mature field static modeling practice. The methodology comprises five integrated stages: (1) baseline model audit identifying systematic performance anomalies symptomatic of architectural misconception; (2) data-driven reinterpretation integrating seismic geomorphology, sedimentological reanalysis, and production diagnostics to propose revised element boundaries; (3) static model re-population implementing revised architectural frameworks within geocellular constructs; (4) dynamic calibration discriminating between competing interpretations through history matching; and (5) decision-control formalism translating narrowed uncertainty into quantified infill well rankings and investment sanction criteria.
The workflow is demonstrated through application to two anonymized offshore assets: a wave-influenced deltaic reservoir (Asset D, Niger Delta analogue) and a confined turbidite channel-lobe system (Asset T, deepwater Gulf of Mexico analogue), both characterized by 15–28 years of production history and legacy static models constructed under initial appraisal-phase data constraints. Application of the workflow to Asset D achieved a 60% reduction in Original Oil in Place uncertainty span (P90-P10 range narrowed from 91% to 36% relative to P50), a 74% reduction in Connected Static Volume uncertainty for candidate infill locations, and a 46% improvement in history match quality without geologically implausible parameter adjustments. Critically, architectural reinterpretation—distinguishing distributary channel from mouth bar elements using integrated seismic-core-dynamic evidence—directly enabled sanction of Well D-44, which was ranked 9th under the legacy interpretation but elevated to 2nd rank under the revised framework. Well D-44 delivered 3.21 MMstb cumulative production over 66 months, tracking within 6% of revised model forecasts and generating $18.2 million incremental net present value. Across both study assets, the workflow identified seven previously unrecognized infill opportunities, collectively representing 12.4 MMstb of incremental accessible resources, with four wells drilled to date achieving average forecast accuracy within ±12%. This study demonstrates that systematic architectural element re-definition, conducted through disciplined integration of existing datasets rather than new data acquisition, functions as a powerful decision-control mechanism that narrows uncertainty, improves model predictiveness, and directly governs capital allocation confidence in mature clastic reservoirs.
The methodology transforms static geological models from passive knowledge repositories into active decision-control systems, providing transferable value to hydrocarbon portfolio optimization and emerging subsurface energy transition applications including CO₂ storage site characterization and geothermal resource assessment.
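The uncertainty-span metric quoted in this abstract (P90–P10 range relative to P50) is a simple ratio. A minimal sketch follows; note the abstract reports spans, not absolute OOIP volumes, so the percentile inputs below are illustrative assumptions chosen only to reproduce the quoted 91% and 36% spans:

```python
def relative_uncertainty_span(p90, p50, p10):
    """P90-P10 range expressed as a fraction of P50 (industry convention:
    P90 is the low estimate, P10 the high estimate)."""
    return (p10 - p90) / p50

# Illustrative OOIP percentiles (MMstb) -- assumed, not from the paper.
legacy = relative_uncertainty_span(p90=60.0, p50=110.0, p10=160.1)   # ~0.91
revised = relative_uncertainty_span(p90=90.0, p50=110.0, p10=129.6)  # ~0.36
reduction = 1.0 - revised / legacy                                   # ~0.60
```

Narrowing the span from 91% to 36% of P50 corresponds to the roughly 60% reduction the abstract reports.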
- Research Article
- 10.2118/0126-0011-jpt
- Jan 1, 2026
- Journal of Petroleum Technology
- Chris Carpenter
_ This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 221978, “Development of Fast Predictive Models for CO2 Enhanced Oil Recovery and Storage in Mature Oil Fields,” by Yessica Peralta, Ajay Ganesh, and Gonzalo Zambrano, SPE, University of Alberta, et al. The paper has not been peer-reviewed. _ Reservoir modeling tools have played a significant role in designing subsurface fluid-injection methods such as CO2 enhanced oil recovery (EOR). However, these models are computationally expensive, requiring extensive geological and engineering data that often are not available in the early phases of carbon use and storage projects. This work presents the development of fast predictive models and optimization methodologies to evaluate CO2 EOR and storage operations quickly in mature oil fields. Weyburn-Midale CO2 EOR Reservoir Model Description. The Weyburn oilfield is in southern Saskatchewan, Canada. Weyburn oil reserves are within a thin zone of fractured carbonates (maximum thickness of 30 m) deposited in a shallow carbonate shelf environment at a depth of 1350–1450 m. The reservoir consists of two main units, the upper Marly dolostone (thickness ranging from 0 to 10 m) and the lower vuggy limestone (thickness ranging from 0 to 20 m). Oil production began in 1956. CO2 miscible-flooding EOR was initiated in 2000, alternating with water in some wells to improve oil-recovery efficiency and to store CO2 for the long term. The Weyburn-Midale CO2 EOR model in this work is a subarea of the Phase 1A monitoring and storage project. It was developed by using a commercial compositional reservoir simulator. History matching was performed by using 216 well histories (producers and injectors) from April 1964 to the end of 2006. Numerical-Grid Construction. The total model dimensions are 7000, 7800, and 30 m, corresponding to the model width, length, and thickness, respectively. 
The number of gridblocks is 141×280×27, for a total of 1,065,960 gridblocks. The reservoir thickness is approximately 30 m. The 27 vertical layers of the Weyburn-Midale CO2 EOR grid model are distributed as eight layers of marly dolostone (upper layers), four layers of vuggy intershoal, seven layers of vuggy shoal, and eight layers of vuggy lower shoal. Reservoir and Fluid Properties. The Weyburn-Midale reservoir is an anisotropic heterogeneous fractured reservoir. The marly unit is chalky intertidal dolostone with some interbeds of limestone with porosity ranging from 16 to 38%. The matrix permeability ranges from 1 to more than 100 md. The vuggy zone constitutes a heterogeneous subtidal limestone with varied diagenetic and depositional environments, resulting in porosity values from 3 to 18%. The matrix permeability varies from less than 0.01 to more than 500 md, where fractures control the direction and the magnitude of permeability anisotropy. The reservoir features an initial temperature of 63°C and a reference pressure of 15.2 MPa at a depth of 1440 m. The original oil and water in place are 40.16 MRm3 and 21.67 MRm3, respectively. The model’s total pore volume is 61.29 MRm3, with initial fluid saturations of 65% oil and 35% water.
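The grid figures quoted above can be cross-checked with a few lines of arithmetic; the implied cell sizes below are derived from the stated model dimensions, not reported in the paper:

```python
# Grid dimensions reported for the Weyburn-Midale CO2 EOR model.
nx, ny, nz = 141, 280, 27                          # gridblocks in x, y, z
width, length, thickness = 7000.0, 7800.0, 30.0    # model extents, metres

total_blocks = nx * ny * nz                        # 1,065,960, as stated

# Implied average cell sizes (assuming a uniform grid -- an assumption,
# since the paper does not state whether spacing is uniform).
dx, dy, dz = width / nx, length / ny, thickness / nz   # ~49.6, ~27.9, ~1.1 m

# Vertical layering check: 8 marly + 4 vuggy intershoal
# + 7 vuggy shoal + 8 vuggy lower shoal = 27 layers.
assert 8 + 4 + 7 + 8 == nz
```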
- Research Article
- 10.1155/jge5/6694015
- Jan 1, 2026
- Journal of GeoEnergy
- Shitan Yin + 3 more
Production forecasting for oil and gas wells is a decisive element of field‐development planning because it directly guides recovery strategy design, production optimisation and risk management. Conventional methods, including empirical decline‐curve analysis (DCA) and full‐physics numerical simulation, are limited either by their inability to capture complex non‐linear flow behaviour or by prohibitive computational requirements. The rise of big data and artificial intelligence has introduced machine learning models such as support vector regression (SVR), random forests (RFs), XGBoost and multi‐layer perceptrons (MLPs), whose efficient non‐linear fitting has improved predictive accuracy; however, their black‐box nature and weak physical consistency now constrain further progress. Modern deep learning (DL) architectures—including long short‐term memory (LSTM) or gated recurrent unit (GRU) networks, CNN–LSTM hybrids, transformers, graph convolutional networks (GCNs) and kernel adaptive networks—extend modelling capability to long temporal sequences and systems with multiple interacting wells, fostering a technical shift from purely data‐driven learning toward physics‐enhanced intelligence. Of particular note, physics‐informed neural networks (PINNs) embed Darcy flow equations and related constraints directly in the loss function, which markedly strengthens extrapolation ability and interpretability while offering efficient support for history matching, surrogate modelling and closed‐loop reservoir management (CLRM). Nevertheless, these networks still face challenges involving the balance of loss‐term weights, multi‐scale coupling and training convergence; progress will rely on dynamic weighting schemes and a standardised library of physical priors. 
This review, therefore, synthesises the evolution from traditional machine learning to physics‐constrained approaches in production forecasting, assesses their respective advantages and limitations and identifies future research priorities in high‐quality dataset construction, cross‐field transfer learning, interpretability enhancement and system‐level intelligent optimisation in order to realise fully digital and closed‐loop intelligent oilfields.
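The PINN idea described above—embedding a flow-physics residual in the training loss—can be sketched without a deep-learning framework by evaluating a steady-state 1D Darcy residual, d/dx(k dp/dx) = 0, with finite differences. This is a toy illustration of the loss structure only; the weighting `lam` and the pressure field are illustrative assumptions, not taken from the review:

```python
import numpy as np

def pinn_style_loss(p_pred, p_obs, k, dx, lam=1.0):
    """Composite loss L = L_data + lam * L_physics, the structure used by
    physics-informed networks. The physics term penalises the steady-state
    1D Darcy residual d/dx(k dp/dx) evaluated with finite differences."""
    data_loss = np.mean((p_pred - p_obs) ** 2)
    flux = k[:-1] * np.diff(p_pred) / dx       # k dp/dx at cell faces
    residual = np.diff(flux) / dx              # d/dx(k dp/dx) at interior nodes
    physics_loss = np.mean(residual ** 2)
    return data_loss + lam * physics_loss

x = np.linspace(0.0, 1.0, 11)
k = np.ones_like(x)           # homogeneous permeability (illustrative)
p_exact = 1.0 - x             # linear pressure drop satisfies Darcy exactly
loss = pinn_style_loss(p_exact, p_exact, k, dx=0.1)   # essentially zero
```

In an actual PINN the residual is computed by automatic differentiation of the network output, and balancing `lam` against the data term is precisely the loss-weighting challenge the abstract highlights.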
- Research Article
- 10.3390/en19010163
- Dec 27, 2025
- Energies
- Yifan He + 6 more
As a critical parameter for describing oil–water two-phase flow behavior, relative permeability curves are widely applied in field development, dynamic forecasting, and reservoir numerical simulation. This study addresses the issue of relative permeability anisotropy, focusing on the seepage characteristics of two typical bedding structures in sandstone reservoirs—tabular cross-bedding and parallel bedding—through multi-directional displacement experiments. A novel anisotropic relative permeability testing apparatus was employed to conduct displacement experiments on cubic core samples, comparing the performance of the explicit Johnson–Bossler–Naumann (JBN) method, based on Buckley–Leverett theory, with the implicit Automatic History Matching (AHM) method, which demonstrated superior accuracy. The results indicate that displacement direction significantly influences seepage efficiency. For cross-bedded cores, displacement perpendicular to bedding (Z-direction) achieved the highest displacement efficiency (75.09%) and the lowest residual oil saturation (22%), primarily due to uniform fluid distribution and efficient pore utilization. In contrast, horizontal displacement exhibited lower efficiency and higher residual oil saturation due to preferential flow path effects. In parallel-bedded cores, vertical displacement improved efficiency by 18.06%, approaching ideal piston-like displacement. Microscale analysis using Nuclear Magnetic Resonance (NMR) and Computed Tomography (CT) scanning further revealed that vertical displacement effectively reduces capillary resistance and promotes uniform fluid distribution, thereby minimizing residual oil formation. This study underscores the strong interplay between displacement direction and bedding structure, validating AHM’s advantages in characterizing anisotropic reservoirs. 
By integrating experimental innovation with advanced computational techniques, this work provides critical theoretical insights and practical guidance for optimizing reservoir development strategies and enhancing the accuracy of numerical simulations in complex sandstone reservoirs.
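The explicit JBN method mentioned above interprets unsteady-state displacement data through Buckley–Leverett fractional-flow theory. A minimal sketch of the water fractional-flow curve from Corey-type relative permeability curves follows; all parameters are illustrative assumptions except `sor=0.22`, which matches the residual oil saturation reported for Z-direction displacement of the cross-bedded cores:

```python
def water_fractional_flow(sw, swc=0.2, sor=0.22, mu_w=1.0, mu_o=5.0,
                          krw_max=0.3, kro_max=0.8, nw=2.0, no=2.0):
    """Buckley-Leverett fractional flow of water, f_w = 1 / (1 + (kro/krw)(mu_w/mu_o)),
    built from Corey-type relative permeability curves. Endpoint saturations,
    viscosities, and Corey exponents here are illustrative only."""
    s = (sw - swc) / (1.0 - swc - sor)   # normalised mobile water saturation
    krw = krw_max * s ** nw              # water relative permeability
    kro = kro_max * (1.0 - s) ** no      # oil relative permeability
    return 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o))

fw_mid = water_fractional_flow(0.5)      # ~0.68 for these assumed curves
```

The AHM approach the study favours instead adjusts the relative permeability curves iteratively until a numerical simulation reproduces the measured production and pressure history, which avoids the JBN assumption of one-dimensional, incompressible, stabilised flow.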
- Research Article
- 10.54691/7hqy9687
- Dec 20, 2025
- Scientific Journal of Technology
- Haotong Guo + 4 more
Reservoir model parameter inversion (history matching) is a crucial step in oil and gas field development for reducing the uncertainty in reservoir description and improving the accuracy of production prediction. However, traditional methods face issues such as non-uniqueness of solutions and high computational cost. In recent years, advancements in artificial intelligence (AI) technology, particularly deep learning methods, have provided new solutions to these problems. This paper conducts a systematic review of AI-based techniques for reservoir model parameter inversion, including end-to-end inversion methods (Convolutional Neural Networks), Generative Adversarial Networks, Physics-Informed Neural Networks, proxy model-accelerated inversion techniques, and emerging methods like Reinforcement Learning and Meta-learning. It analyzes the principles, advantages, limitations, and applicable scenarios of various methods, investigates key challenges such as data scarcity, uncertainty quantification, physical consistency, and model interpretability, and proposes corresponding solutions. Predictions are made regarding future directions, including the deep integration of physics and AI, efficient uncertainty quantification, and industrial-scale application. This paper aims to serve as a reference for research and application in this field, promoting the use of AI-driven reservoir parameter inversion technology in the digital and intelligent transformation of the oil and gas industry.
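Many of the inversion techniques this review covers are accelerated variants of the ensemble Kalman-type update that underlies modern history matching. A minimal NumPy sketch of one stochastic analysis step is shown below; the linear forward model g(m) = 2m and all numerical values are toy assumptions for illustration:

```python
import numpy as np

def ensemble_update(m_ens, d_ens, d_obs, cd):
    """One stochastic ensemble Kalman-type analysis step: shift each
    parameter member towards the observations through the sample
    cross-covariance. m_ens: (n_m, n_e) parameters; d_ens: (n_d, n_e)
    simulated data; d_obs: (n_d,) observations; cd: (n_d, n_d) noise cov."""
    ne = m_ens.shape[1]
    dm = m_ens - m_ens.mean(axis=1, keepdims=True)
    dd = d_ens - d_ens.mean(axis=1, keepdims=True)
    c_md = dm @ dd.T / (ne - 1)                    # parameter-data cross-covariance
    c_dd = dd @ dd.T / (ne - 1)                    # data covariance
    gain = c_md @ np.linalg.inv(c_dd + cd)         # Kalman gain
    rng = np.random.default_rng(0)
    d_pert = d_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(d_obs)), cd, size=ne).T       # perturbed observations
    return m_ens + gain @ (d_pert - d_ens)

# Toy problem: scalar parameter, linear forward model g(m) = 2m (assumed).
rng = np.random.default_rng(1)
m_prior = rng.normal(0.0, 1.0, size=(1, 200))      # 200-member prior ensemble
d_prior = 2.0 * m_prior                            # simulated responses
m_post = ensemble_update(m_prior, d_prior,
                         d_obs=np.array([3.0]), cd=np.array([[0.01]]))
# Posterior ensemble concentrates near m = 1.5, consistent with 2m = 3.
```

The AI methods the paper surveys typically replace the expensive simulator call that produces `d_ens` with a learned proxy, or replace the linear update itself with a trained inverse mapping.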
- Research Article
- 10.54097/jzxdmx11
- Dec 10, 2025
- Mathematical Modeling and Algorithm Application
- Haoxi Shi + 5 more
Reservoir history matching is a crucial process in oilfield development, aiming to calibrate geological models, characterize reservoir parameters, and enhance predictive accuracy. Traditional manual history matching relies heavily on human expertise, extensive computational effort, and subjective judgment, which can no longer meet the demands posed by increasingly complex geological structures and massive data volumes. In recent years, data-driven and machine learning approaches have shown great potential in reservoir inversion and production forecasting. Emerging models such as proxy models, deep neural networks, graph neural networks (GNNs), and Transformers provide new paradigms for achieving automatic reservoir history matching. This paper reviews the historical evolution of history matching methods, tracing their progression from early numerical simulation and optimization algorithms to modern data-driven modeling techniques. From the perspective of machine learning, it further investigates the performance of these methods in dynamic reservoir prediction and discusses the application prospects of spatio-temporal fusion models based on GNNs and Transformers. Finally, the paper outlines the current challenges and future directions in this field, aiming to contribute to the advancement of intelligent reservoir modeling and the development of digital twin reservoirs.