Articles published on monte-carlo-simulations
210482 Search results
- New
- Research Article
- 10.1108/ci-07-2025-0315
- Mar 10, 2026
- Construction Innovation
- Kailash Choudhary + 4 more
Graphical abstract (source: authors' own work): a conceptual schematic compares sand and ZOT use for foundation construction, indicating increased strength, reduced CO2 emissions, environmental sustainability and stakeholder interest, and the diversion of material away from a hazardous tailing waste site.
Purpose: This study aims to assess and compare the environmental impacts of using zinc tailings versus sand as structural filling material, and to examine how transportation distance affects the environmental impacts of zinc tailings.
Design/methodology/approach: The study follows a three-step methodology in accordance with the life cycle assessment framework. Both systems were modeled in the OpenLCA software, and the results are presented in 11 midpoint and three endpoint environmental impact categories. Primary data were taken from a real-world case study in India and secondary data from the ecoinvent 3.9.1 database. Monte Carlo simulation was used to explore the sensitivity of the results to transportation distance.
Findings: The analysis reveals significant environmental advantages of zinc tailing utilization across all midpoint and endpoint categories, including a 94% reduction in climate change impact and a 52% decrease in terrestrial acidification compared with sand foundations. Even with varying transport distances, zinc tailing use remains environmentally beneficial overall.
Research limitations/implications: The study does not compare the economic and social impacts of repurposing mining waste in this way.
Originality/value: This research contributes to the growing body of knowledge on sustainable mining practices and circular economy principles in construction. It demonstrates the potential of repurposing mining waste to address environmental challenges while providing an alternative construction material.
- New
- Research Article
- 10.1021/acs.est.5c18261
- Mar 10, 2026
- Environmental science & technology
- Lei Zhang + 1 more
The rapid proliferation of incineration plants has made their carbon emissions a substantial contributor to urban totals. Municipal solid waste classification presents waste-to-energy plants with both challenges and opportunities. Monitoring nine Shenzhen incineration plants (2018-2022), we employed Monte Carlo simulations for sensitivity analysis and constructed response surface models to derive quantitative emission reduction pathways. Results indicate that the plastic incineration proportion rose to 27.6%, increasing both direct emissions and carbon substitution credits from electricity. Reducing plastic by 1% cuts emissions by approximately 17.38 kg CO2-eq/t. The plastic proportion should be kept below 34%, considering synergistic effects among the dominant emission determinants: plastic composition, grid carbon intensity, and electricity output. Singular interventions targeting plastic reduction or energy efficiency improvements have limited mitigation potential. A comprehensive mitigation scenario combining efficiency improvement, combined heat and power, plastic waste reduction, enhanced recycling, and bioplastic substitution enables net negative emissions under future low-carbon grids. This study highlights that systemic transformation from carbon source to carbon sink necessitates coordinated action across multiple stakeholders.
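The sensitivity logic above can be sketched as a toy Monte Carlo sweep. The emission coefficient below is chosen so that a one-percentage-point cut in the plastic fraction changes direct emissions by the quoted 17.38 kg CO2-eq/t; the sampling ranges for grid carbon intensity and electricity output are illustrative assumptions, not the calibrated Shenzhen values.

```python
import random

def net_emissions(plastic_frac, grid_intensity, elec_output):
    """Toy emission balance for a waste-to-energy plant (illustrative).

    Direct fossil CO2 scales with the plastic fraction; avoided grid
    emissions scale with electricity exported per tonne of waste.
    """
    direct = 1738.0 * plastic_frac         # kg CO2-eq/t from plastic combustion
    credit = grid_intensity * elec_output  # kg CO2-eq/t avoided on the grid
    return direct - credit

def mc_sensitivity(n=10_000, seed=0):
    """Sample the three dominant drivers and return mean net emissions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        plastic = rng.uniform(0.20, 0.35)  # plastic mass fraction
        grid = rng.uniform(0.3, 0.6)       # kg CO2-eq per kWh on the grid
        elec = rng.uniform(300, 500)       # kWh exported per tonne
        total += net_emissions(plastic, grid, elec)
    return total / n
```

Because the toy balance is linear, the Monte Carlo mean simply tracks the midpoint parameters; in a response-surface setting the same sampling exposes interactions among the three drivers.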
- New
- Research Article
- 10.1097/hp.0000000000002119
- Mar 9, 2026
- Health physics
- Takahiro Kitajima + 3 more
Following the 2011 accident at the Fukushima Daiichi Nuclear Power Plant, simplified thyroid screening was carried out in children to determine internal exposure to radioactive iodine. Because NaI(Tl) scintillation survey meters, originally designed for environmental radiation measurement, were used, quantifying radioiodine accumulated in the thyroid involved considerable uncertainty. This study investigated factors impacting measurements, specifically thyroid size (volume) and the contribution of short-lived radionuclides other than ¹³¹I (¹³²I, ¹³³I, and ¹³⁵I), using Monte Carlo simulations with the PHITS code. For 1-y-old children, comparison between standard and minimal thyroid volumes showed that volume differences combined with variations in soft tissue thickness could cause the thyroid equivalent dose to differ by up to 76.5%. Furthermore, considering short-lived radionuclides revealed that at a ¹³¹I thyroid-equivalent dose of 100 mSv, the actual dose could reach 131.7 mSv. Thus, a screening level was used in the simplified method to effectively detect ¹³¹I. As individual differences in thyroid volume exist even among children of the same age, contributions from radionuclides other than ¹³¹I may not be negligible, impacting screening level use. The results suggest that to assess thyroid radiation dose more accurately, it is necessary to consider both age and individual thyroid volume differences, which can cause measurement errors up to 76.5%. Regarding dose contribution from short-lived radionuclides, estimates based on relative proportions of radioactive materials adhered to evacuees' clothing at the time of the accident indicated a maximum impact of 31.7%.
- New
- Research Article
- 10.1021/acs.jctc.5c01657
- Mar 9, 2026
- Journal of chemical theory and computation
- Craig Daniels + 3 more
Kinetic Monte Carlo (KMC) simulations are broadly used to investigate chemical and materials systems where a balance between atomic detail and diffusion or reaction time scales is needed. Here we present KinCat, an open-source 2D KMC package designed for use in lattice-KMC studies of surface kinetics in heterogeneous catalytic systems. It is written in C++ and uses Kokkos to facilitate use on a variety of shared-memory CPU/GPU/accelerator systems. We demonstrate the performance scaling of KinCat on GPU and CPU architectures, using CO oxidation on RuO2 as a model system. KinCat efficiently manages large lattice KMC simulations using a parallel domain-decomposition algorithm.
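As a minimal illustration of the lattice-KMC machinery that packages like KinCat implement at scale, the sketch below runs a rejection-free (Gillespie/BKL-style) adsorption-desorption model on a small lattice; the rates, lattice size, and single-event catalogue are illustrative and far simpler than CO oxidation on RuO2.

```python
import math
import random

def kmc_adsorption(n_sites=100, k_ads=1.0, k_des=0.5, n_steps=5000, seed=1):
    """Rejection-free lattice KMC for single-site adsorption/desorption.

    Each step: tally the total rate of every possible event, pick an
    event class with probability proportional to its rate, execute one
    event, and advance time by an exponential increment.
    """
    rng = random.Random(seed)
    occupied = [False] * n_sites
    t = 0.0
    for _ in range(n_steps):
        n_occ = sum(occupied)
        r_ads = k_ads * (n_sites - n_occ)  # total adsorption rate
        r_des = k_des * n_occ              # total desorption rate
        r_tot = r_ads + r_des
        if rng.random() * r_tot < r_ads:   # choose event class by rate
            site = rng.choice([i for i, o in enumerate(occupied) if not o])
            occupied[site] = True
        else:
            site = rng.choice([i for i, o in enumerate(occupied) if o])
            occupied[site] = False
        t += -math.log(1.0 - rng.random()) / r_tot  # exponential waiting time
    return sum(occupied) / n_sites, t
```

At equilibrium the coverage fluctuates around the Langmuir value k_ads / (k_ads + k_des), here 2/3; production codes replace the event catalogue rebuild with incremental updates and domain decomposition.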
- New
- Research Article
- 10.1007/s44216-026-00073-z
- Mar 9, 2026
- Asian Review of Political Economy
- Dwayne Woods
Abstract Rare earth elements (REEs) are abundant in the Earth’s crust but difficult to access technologically. This creates a strategic imbalance that China has exploited not just through dominance in production but also by maintaining deliberate opacity. This paper introduces a dynamic signaling model in which nondisclosure of quotas, patents, and regulations serves as a recurring tactic that increases uncertainty, prompts cautious hedging, and results in unpredictable timelines for diversification among rival countries. Static costly-signaling models cannot fully describe this ongoing process; instead, pooling equilibria based on nondisclosure prevail, supported by super-modular effects through various layers of opacity. Monte Carlo simulations show that opacity can halve the likelihood of reaching resilience within 15 years and broaden the range of possible diversification timelines, shifting expected completion times by more than 4 years. Policy scenarios suggest that transparent procedures are most effective in enhancing resilience, while excessive opacity can backfire by causing panic-driven diversification.
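The headline simulation result (opacity roughly halving the chance of resilience within 15 years) can be illustrated with a deliberately simple geometric-timeline Monte Carlo, in which opacity appears only as a lower annual probability of diversification progress; both probabilities below are hypothetical stand-ins for the paper's signaling model.

```python
import random

def prob_resilient_by(horizon=15, annual_p=0.12, n_trials=20_000, seed=2):
    """Estimate the probability that diversification completes within
    `horizon` years when each year independently succeeds with annual_p.
    Illustrative geometric-timeline model, not the paper's game.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        for _year in range(horizon):
            if rng.random() < annual_p:
                hits += 1
                break
    return hits / n_trials

transparent = prob_resilient_by(annual_p=0.12)  # rivals can plan confidently
opaque = prob_resilient_by(annual_p=0.05)       # hedging slows annual progress
```

The analytic benchmark here is 1 - (1 - p)^15, so the Monte Carlo is easy to sanity-check; the point is only how a modest drop in annual progress probability compounds into a large drop in 15-year resilience.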
- New
- Research Article
- 10.1063/5.0308542
- Mar 9, 2026
- The Journal of chemical physics
- Rui Zhao + 5 more
Sulfur hexafluoride (SF6) is particularly important for purification and recovery in environmental management and resource optimization due to its extremely high global warming potential and widespread industrial applications. In this study, we employed grand canonical Monte Carlo simulations and density functional theory calculations to investigate heteroatom (C, N, and O) functionalization modifications of covalent organic framework (COF)-637, aiming to evaluate its selective adsorption performance for SF6/N2 mixtures. The results indicate that at 298 K and 1 bar, the selectivity values of heterocyclic COF-2O and COF-2N for SF6/N2 (10:90 v:v) mixtures are 400.79 and 353.57, respectively. The introduction of heteroatoms effectively enhances the selective separation performance of the original framework. By analyzing the adsorption isotherms of SF6 in the mixed components within the framework at pressures ranging from 0.1 to 1 bar, it is confirmed that heterocyclic modification can effectively enhance the selective capture of SF6. Further analysis, including charge difference density, Bader charges, and the independent gradient model for weak interactions, reveals the interactions between SF6 and the framework. This study provides theoretical support for understanding the adsorption mechanisms of COFs and for designing highly efficient materials for SF6 purification.
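The selectivity figures quoted for the GCMC mixture runs follow the standard definition below; the function is generic, and the loading ratio in the usage line is back-calculated from the reported selectivity rather than taken from the paper.

```python
def mixture_selectivity(q_a, q_b, y_a, y_b):
    """Adsorption selectivity S_A/B = (q_A / q_B) / (y_A / y_B), where the
    q's are equilibrium loadings from a mixture GCMC run and the y's are
    bulk-phase mole fractions of the feed."""
    return (q_a / q_b) / (y_a / y_b)

# For a 10:90 SF6/N2 feed, a selectivity of about 400.79 implies an
# SF6/N2 loading ratio of roughly 400.79 * (10 / 90), i.e. about 44.5.
s = mixture_selectivity(44.532, 1.0, 0.10, 0.90)
```

The same formula underlies IAST selectivities as well, with the loadings predicted from single-component isotherms instead of simulated mixtures.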
- New
- Research Article
- 10.3390/stats9020027
- Mar 7, 2026
- Stats
- Vasileios Papadopoulos + 1 more
Anatomical variants are observed on paired body sides, yet many prevalence studies—particularly those based on osteological collections—report only right- and left-side frequencies without specifying whether findings occur bilaterally in the same individual. In such cases, the individual-level left–right structure is unobserved. Consequently, inference on laterality and bilateralism cannot be based on the reported data alone and must rely on explicit assumptions about within-individual dependence. We study this problem in the context of anatomic prevalence data, although the framework applies more broadly to paired binary outcomes. We parameterize the admissible joint distributions using a feasibility-based dependence index λ, spanning the full range from independence to maximal feasible concordance implied by the marginal prevalences. Within this framework, we examine two complementary estimands: the paired odds ratio for laterality and bilateral prevalence. Analytic results and Monte Carlo simulations show that bilateral prevalence varies linearly and remains stable across the admissible dependence range, whereas the paired odds ratio exhibits intrinsic boundary instability as dependence approaches its feasible maximum due to vanishing discordant counts. Uncertainty-propagation analyses further indicate that laterality inference is robust to moderate misspecification of the dependence assumption. These results demonstrate that unobserved within-subject dependence is a structural inferential issue in paired binary meta-analysis and motivate feasibility-based sensitivity analysis when only marginal data are available.
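The dependence structure described above can be made concrete with one natural parameterization consistent with the abstract (not necessarily the authors' exact index): λ = 0 gives independence and λ = 1 the maximal feasible concordance min(p_R, p_L), so bilateral prevalence is linear in λ while the paired odds ratio degenerates as the discordant cells vanish.

```python
def joint_from_marginals(p_r, p_l, lam):
    """Joint distribution of a paired binary trait from side marginals.

    lam = 0 gives independence; lam = 1 gives the Frechet upper bound
    min(p_r, p_l). The bilateral cell p11 is linear in lam.
    """
    p11 = (1 - lam) * p_r * p_l + lam * min(p_r, p_l)
    p10 = p_r - p11              # right side only
    p01 = p_l - p11              # left side only
    p00 = 1 - p11 - p10 - p01    # neither side
    return p11, p10, p01, p00

def paired_odds_ratio(p11, p10, p01, p00):
    """Concordance odds ratio; blows up as discordant cells vanish."""
    return (p11 * p00) / (p10 * p01)
```

For p_r = 0.3 and p_l = 0.2, the odds ratio rises from 1 at independence toward infinity as λ approaches 1, while p11 moves linearly from 0.06 to 0.20, which mirrors the stability contrast the abstract reports.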
- New
- Research Article
- 10.1063/5.0316275
- Mar 7, 2026
- The Journal of chemical physics
- Harold W Hatch
Parallelization of Monte Carlo (MC) simulation is required to match the performance growth of molecular dynamics, because computer processor clock speeds have plateaued while the number of cores has increased. Although prefetch parallelization can speed up a Monte Carlo molecular simulation by a factor of 3 using four parallel threads for simultaneous single-particle displacements in the canonical ensemble, other ensembles require multiple trial types that reduce efficiency when threads wait for other threads with more time-consuming trials, such as volume changes or particle insertions and deletions in the isothermal-isobaric, grand canonical, and Gibbs ensembles. Load balancing increases efficiency by attempting the same trial type in each thread of a parallel batch but violates detailed balance if done incorrectly. By computing standard deviations as a function of processor time, efficiency is systematically investigated over a variety of ensembles, load-balancing algorithms, and trial attempt and acceptance probabilities for dense liquids of Lennard-Jones particles and an extended simple point charge model of water, revealing numerous efficiency gains, including in serial simulations. Parallel efficiency in these ensembles approached the theoretical maximum by reducing overhead costs with improved algorithms and data structures released in the open-source Monte Carlo software FEASST.
- New
- Research Article
- 10.1002/dac.70462
- Mar 6, 2026
- International Journal of Communication Systems
- Aryan Bansal + 3 more
ABSTRACT In this paper, we present a cooperative non‐orthogonal multiple access (CoNOMA) communication system featuring two source users, two destination users, and a decode‐and‐forward relay. Using a single relay, both sources communicate to their respective destination using the NOMA scheme. Further, to improve the user's performance, both the direct and the relayed links at the destination are combined using the maximal ratio combining technique. To assess the robustness of our proposed framework, we formulate mathematical equations for the outage probability of the end users. To consider a more realistic scenario, practical constraints like hardware impairments (HI), imperfect channel state information (imCSI), and imperfect successive interference cancellation (imSIC) are considered at the destination node. We have presented the proposed system's average bit error rate performance to prove its effectiveness over existing systems. The analytical results have been validated through Monte Carlo simulations. Our work contributes to advancing the understanding and reliability of cooperative NOMA systems, offering insights that can be pivotal for designing future communication networks.
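Analytical error-rate expressions of this kind are conventionally checked against Monte Carlo bit-level simulation. A minimal, generic example of that pattern (BPSK over AWGN, far simpler than the CoNOMA system with HI, imCSI, and imSIC) compares a simulated bit error rate with its closed form:

```python
import math
import random

def bpsk_ber_mc(snr_db, n=200_000, seed=7):
    """Monte Carlo bit error rate for BPSK over AWGN: transmit +1,
    add Gaussian noise, count sign flips at the detector."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)
    sigma = math.sqrt(1 / (2 * snr))   # noise std for unit-energy bits
    errors = sum(1 for _ in range(n) if 1.0 + rng.gauss(0.0, sigma) < 0.0)
    return errors / n

def bpsk_ber_theory(snr_db):
    """Closed-form BPSK/AWGN BER: 0.5 * erfc(sqrt(Eb/N0))."""
    snr = 10 ** (snr_db / 10)
    return 0.5 * math.erfc(math.sqrt(snr))
```

Agreement within Monte Carlo noise validates the closed form, mirroring how the paper's outage-probability and BER curves are verified against simulation.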
- New
- Research Article
- 10.1128/aac.01033-25
- Mar 4, 2026
- Antimicrobial agents and chemotherapy
- Hailan Wu + 12 more
Contezolid is a novel oxazolidinone antibiotic for the treatment of gram-positive bacteria, which are among the most common pathogens of pneumonia. We conducted a prospective, single-center, open-label study to evaluate the clinical and microbiological efficacy, safety profile, and pulmonary epithelial lining fluid (ELF) penetration characteristics of contezolid in adult pneumonia patients. Sparse blood samples and bronchoalveolar lavage fluid samples were collected from patients after multiple oral doses of 800 mg of contezolid twice a day. Pharmacokinetic parameters were estimated with a population pharmacokinetic (PopPK) model, and the probability of target attainment was evaluated by Monte Carlo simulations. The study enrolled 15 patients (mean age 55 years) with primarily community-acquired pneumonia. Contezolid achieved a clinical cure rate of 80.0% and a bacterial clearance rate of 71.4%. Oral contezolid was well tolerated, and no drug-related adverse effects were observed in any of the subjects. The mean area under the concentration-time curve (AUC0-12,ss) was estimated by the PopPK model to be 33.06 mg·h/L in ELF and 71.95 mg·h/L in plasma. Assuming a plasma protein binding rate of 90% based on literature data, the ELF-to-free plasma AUC0-12,ss ratio was 4.50. When the minimum inhibitory concentration was ≤4 mg/L, 800 mg of contezolid q12h could achieve the optimal therapeutic target in the plasma of patients with pneumonia. This study demonstrates that contezolid achieved excellent pulmonary penetration in adult patients with pneumonia.
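The probability-of-target-attainment step can be sketched as a Monte Carlo loop over simulated exposures. The mean plasma AUC0-12,ss (71.95 mg·h/L, doubled here to a daily exposure) and the 90% protein binding come from the abstract; the lognormal between-patient variability (35% CV) and the fAUC/MIC target value are illustrative assumptions, not the study's PopPK outputs.

```python
import math
import random

def pta(mic, target=2.3, mean_auc24=143.9, cv=0.35, fu=0.10,
        n=20_000, seed=3):
    """Monte Carlo probability of target attainment (PTA).

    Samples daily total AUC from a lognormal distribution, converts to
    free drug with the unbound fraction fu, and counts the fraction of
    virtual patients meeting fAUC/MIC >= target. CV and target are
    hypothetical, chosen for illustration only.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1 + cv ** 2))
    mu = math.log(mean_auc24) - sigma ** 2 / 2  # lognormal with given mean
    hits = 0
    for _ in range(n):
        auc = math.exp(rng.gauss(mu, sigma))
        if fu * auc / mic >= target:
            hits += 1
    return hits / n
```

PTA falls as MIC rises, which is how MIC-specific conclusions such as the ≤4 mg/L result are derived from a simulated patient population.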
- New
- Research Article
- 10.3389/frai.2026.1795842
- Mar 4, 2026
- Frontiers in Artificial Intelligence
- Alexander Borg + 12 more
Objective To explore whether an AI-enhanced social robotic virtual patient (VP) platform reinforces empathetic behaviour patterns in medical students compared with a traditional computer-based platform. Methods Twenty-three sixth-semester medical students from Karolinska Institutet participated in semi-structured interviews following VP encounters with the Social AI-enhanced Robotic Interface (SARI) and, as a comparator, the computer-based Virtual Interactive Case system (VIC). Additionally, 178 students quantitatively evaluated the VP platforms for empathy training using categorical nominal variables and a visual analogue scale (VAS), with a score of 0 indicating full preference for SARI and 10 full preference for VIC. Interview data were thematically analysed, and quantitative preferences were compared using Fisher’s exact test with Monte Carlo simulation and the Wilcoxon signed-rank test. Results Thematic analysis yielded five major themes wherein students consistently reported that SARI facilitated greater empathetic engagement through multimodal interaction, ability to express emotions, and real-time communication adaptability. Quantitative analysis demonstrated a higher preference for SARI versus VIC (78% versus 6%; OR: 190.4; 95% CI: 76.8–472.0; p < 0.001), which remained consistent across subgroups of interest, i.e., female and male students, with and without prior experience in VPs, and students first exposed to SARI or first exposed to VIC. VAS data also showed a preference for SARI versus VIC (median: 2.00; IQR: 1.00–4.00; W: 738.5; r: 0.70; p < 0.001). Conclusion Our AI-enhanced social robotic VP platform was superior to a traditional computer-based VP platform in fostering empathetic engagement in medical students through enhanced authenticity and interactivity, supporting its potential to supplement clinical rotations.
- New
- Research Article
- 10.3390/w18050612
- Mar 4, 2026
- Water
- Gaiqiang Yang + 6 more
Climate change and growing water scarcity necessitate that irrigation districts allocate limited water resources more efficiently, with explicit consideration of multi-source uncertainties. To maximize the effective utilization coefficient of irrigation water, an uncertainty-informed optimization and dynamic regulation framework for agricultural water allocation (UODRA) was developed. The framework quantifies and characterizes uncertainties arising from meteorological forcings, soil heterogeneity, irrigation practices, and water losses during conveyance and field application. The fractional programming model derived therefrom is solved via Dinkelbach’s algorithm, and Monte Carlo simulation is adopted in a reduced scenario space to propagate the dominant uncertainty drivers and assess the distribution characteristics of outcomes and associated risks. A case study was conducted in the Fendong Irrigation District to evaluate three water supply scenarios. The results indicate that with sufficient water supply and diminishing marginal returns, the effective utilization coefficient of irrigation water increases accordingly. Uncertainty mainly exerts an impact on the degree of dispersion and downside risks rather than at the average level. Sensitivity analysis shows that efficiency-related perturbations are the primary drivers of output variability, and their impacts are greater than those of supply-side perturbations and demand-side variation in simulated irrigation demand. Further technical comparison reveals that the adoption of high-efficiency irrigation can significantly improve the performance at the regional level: under drip irrigation conditions, the efficiency reaches 0.614, while that of sprinkler irrigation is 0.499, with a simultaneous improvement in operational stability. Overall, UODRA provides a quantitative decision support method for robust irrigation water resource allocation and adaptive management under uncertain conditions.
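Dinkelbach's algorithm, named above as the solver for the fractional program, replaces the ratio objective max f(x)/g(x) with a sequence of parametric subproblems max f(x) - q*g(x). A self-contained sketch on a toy one-dimensional ratio, with grid search standing in for the water-allocation subproblem solver:

```python
def dinkelbach(f, g, solve_parametric, tol=1e-8, max_iter=100):
    """Maximize f(x)/g(x), g > 0, via Dinkelbach iterations.

    Each pass solves the parametric problem max_x f(x) - q*g(x); the
    method stops when that optimal value reaches zero, at which point
    q equals the optimal ratio.
    """
    q = 0.0
    x = None
    for _ in range(max_iter):
        x = solve_parametric(q)
        if abs(f(x) - q * g(x)) < tol:
            break
        q = f(x) / g(x)
    return x, q

# Toy instance: maximize (x + 1) / (x^2 + 1) over a grid on [0, 2];
# the true maximizer is x = sqrt(2) - 1 with ratio (sqrt(2) + 1) / 2.
grid = [i / 1000 for i in range(2001)]
f = lambda x: x + 1
g = lambda x: x * x + 1
x_opt, ratio = dinkelbach(f, g, lambda q: max(grid, key=lambda x: f(x) - q * g(x)))
```

In the UODRA setting the subproblem solver would be the allocation optimizer, and the Monte Carlo layer wraps the whole iteration to propagate the uncertain inputs.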
- New
- Research Article
- 10.3390/aerospace13030244
- Mar 4, 2026
- Aerospace
- Seong-Hyeon Jo + 1 more
High-agility spacecraft require time-efficient attitude maneuvers under strict actuator- and system-driven saturation limits on angular rate and angular acceleration. Analytical methods for attitude profile generation are attractive for on-board use because of their deterministic structure and low computational burden; however, depending on boundary conditions and sequential constraint-enforcement logic, they may yield either infeasible commands that violate constraints or overly conservative commands that underutilize available authority and unnecessarily prolong maneuver time. In contrast, numerical optimization-based methods can produce (near-)minimum-time solutions but are often too iterative and tuning-sensitive for real-time deployment. The proposed method produces an iteratively refined closed-form solution. The inner loop yields a closed-form solution for a given set of parameters, while the outer loop updates the parameter set via an iterative rescale step. The resulting finite-jerk (jerk-limited) profiles are intended for use in a feedforward–feedback architecture to mitigate terminal mismatch induced by quaternion-kinematics linearization and acceleration-related variable mappings. Numerical studies evaluate the proposed method using representative single-case examples and Monte Carlo simulations with comparisons against a baseline analytical method and a numerical optimization-based method. These results indicate that the proposed approach substantially improves feasibility and optimality such that it achieves maneuver times close to those of numerically optimized solutions, while maintaining a semi-closed-form structure.
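The trade-off between analytical profile generation and constraint enforcement is easiest to see in the scalar rest-to-rest case. The sketch below gives the classic acceleration-limited (bang-bang) minimum time for an eigenaxis slew; it ignores the jerk limit and the quaternion kinematics that the paper's iteratively refined closed form handles.

```python
import math

def rest_to_rest_time(theta, w_max, a_max):
    """Minimum time for a rest-to-rest slew of angle theta under rate
    limit w_max and acceleration limit a_max (bang-bang profile,
    jerk ignored)."""
    theta_crit = w_max ** 2 / a_max  # below this angle, w_max is never reached
    if theta < theta_crit:
        return 2.0 * math.sqrt(theta / a_max)  # triangular rate profile
    return theta / w_max + w_max / a_max       # trapezoidal rate profile
```

For theta = 1 rad, w_max = 0.5 rad/s, a_max = 0.25 rad/s^2 the profile is exactly trapezoidal and takes 4 s; imposing a finite jerk limit lengthens the maneuver by smoothing the acceleration corners, which is where iterative rescaling of the profile parameters comes in.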
- New
- Research Article
- 10.37868/hsd.v8i1.1793
- Mar 4, 2026
- Heritage and Sustainable Development
- Yuri Chernenko + 1 more
Critical-infrastructure project appraisal often undervalues resilience investments because many disruption impacts are non-market in origin yet material for stakeholders. This study proposes the tiered tri-phase resilience valuation framework plus (TRVF+), which integrates three operational resilience dimensions – preparedness, continuity maintenance, and restoration – into a CBA-compatible decision metric. TRVF+ computes normalized indices (PPI, CMI, SRI/SRI_adj) and a composite social stability index (CSSI); monetizes avoided disruption impacts (ADIC) from complaints, trust/satisfaction proxies, and regulatory standing; and propagates parameter uncertainty via Monte Carlo simulation (10,000 runs per case) to estimate a resilience-adjusted benefit–cost ratio (RABCR) and the decision-robustness probability Pr(RABCR > 1.0). The framework is demonstrated on four anonymized pilot cases representing increasing data maturity (Tiers 1–3). Monetized intangible benefits account for ~15–30% of total benefits on average (up to ~40% within Tier-1 uncertainty bounds). Across cases, baseline BCR values of 0.88–1.15 increase to post-intervention RABCR values of 1.10–1.45, and decision robustness meets or exceeds a moderate acceptance rule (Pr(RABCR > 1.0) ≥ 0.80; observed range 0.81–0.95). TRVF+ enables auditable valuation of resilience and supports communication of uncertainty in stakeholder consultations and public hearings.
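The decision-robustness probability can be estimated with a short Monte Carlo loop of the kind TRVF+ runs per case; every numeric input below is illustrative, not pilot-case data, and the lognormal noise model is an assumed stand-in for the framework's parameter distributions.

```python
import random

def rabcr_robustness(base_benefit=100.0, intangible_share=0.22,
                     cost=95.0, benefit_cv=0.15, n=20_000, seed=4):
    """Monte Carlo estimate of Pr(RABCR > 1).

    Adds monetized intangible (avoided-disruption) benefits to the base
    benefit, perturbs the total with multiplicative lognormal noise, and
    counts how often the resilience-adjusted benefit-cost ratio exceeds 1.
    All parameter values are illustrative.
    """
    rng = random.Random(seed)
    total_benefit = base_benefit * (1 + intangible_share)
    hits = 0
    for _ in range(n):
        rabcr = total_benefit * rng.lognormvariate(0.0, benefit_cv) / cost
        if rabcr > 1.0:
            hits += 1
    return hits / n
```

Comparing a run with intangible_share=0.0 against the default shows how monetized intangibles can lift a marginal baseline BCR into the robust-acceptance region, which is the mechanism behind the reported BCR-to-RABCR shift.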
- New
- Research Article
- 10.1021/acs.inorgchem.6c00337
- Mar 4, 2026
- Inorganic chemistry
- Jia-Yao Liu + 5 more
The efficient removal of ethane (C2H6) and propane (C3H8) from natural gas is vital for its purification. A synergistic pore-engineering strategy that integrates pore space partition and fluorine functionalization in metal-organic frameworks (MOFs) may effectively promote C-H···π and C-H···F interactions for effective methane separation. This strategy was validated using two fluorine-functionalized pore-space-partitioned MOFs (SNNU-707/-708) constructed by introducing varying numbers of -CF3 groups on the pore surface. Single-component adsorption isotherms show that the C2H6 and C3H8 uptakes of SNNU-707/-708 were 94.9/63.6 cm3 g-1 and 96.4/68.9 cm3 g-1, respectively, significantly exceeding those of CH4 (18.9/13.4 cm3 g-1). Ideal adsorbed solution theory (IAST) indicated high selectivity values of 85.2/116.6 for C3H8/CH4 (50/50) and 16.7/17.0 for C2H6/CH4 (50/50). Notably, the actual breakthrough interval times of SNNU-707 for C3H8/CH4 (5/95) and C2H6/CH4 (10/90) can reach 502 and 78 min·g-1 and yield high-purity CH4 (>99.5%) at 5.89 mmol g-1 from ternary mixtures. Grand canonical Monte Carlo (GCMC) simulations attribute this performance to synergistic weak interactions (C-H···π, C-H···F, C-H···O/N) between the MOFs and the alkanes. In particular, thanks to the fluorine-functionalized pore environments, both MOFs maintain structural integrity and separation performance under harsh conditions up to 98% relative humidity, which is crucial for practical wet natural gas separation.
- New
- Research Article
- 10.1525/elementa.2025.00089
- Mar 4, 2026
- Elem Sci Anth
- Jinya Wang + 4 more
Methane-mitigation policies increasingly rely on simulation tools to design and evaluate Leak Detection and Repair (LDAR) programs, yet the extent to which these models reproduce real-world emissions remains unclear. This study provides the first direct, data-driven evaluation of 2 widely used open-source LDAR simulators, Fugitive Emissions Abatement Simulation Toolkit (FEAST) and the Leak Detection and Repair Simulator (LDAR-Sim), using a comprehensive regulatory dataset from the British Columbia Energy Regulator (BCER) covering 2020–2023. To ensure methodological alignment, simulations were configured to match empirical survey frequencies, optical gas imaging (OGI)-detection conditions, and BCER-calibrated leak generation parameters. Annual methane emissions from each model were compared with observed OGI-detectable emissions, and model behavior was further assessed through stratified Monte Carlo simulations, bootstrap aggregation, and targeted sensitivity analyses. Both models systematically underestimated annual emissions and did not reproduce the non-monotonic interannual patterns observed in the BCER records, especially the 2021 peak. The sensitivity analyses indicate that the realism of the repair process, including the timing of repairs and the occurrence of incomplete or delayed repairs, is the primary factor governing model accuracy. This influence is stronger than that of detection thresholds or survey frequency. LDAR-Sim demonstrated greater robustness because it samples repair delays from empirical distributions, while FEAST exhibited higher volatility when subjected to realistic repair behavior. These findings highlight key structural limitations in current LDAR simulation frameworks and underscore the need for improved representation of repair compliance. The results provide a transparent benchmark for LDAR model evaluation and offer guidance for enhancing simulation fidelity to support methane-mitigation policy design.
- New
- Research Article
- 10.60923/issn.1973-2201/18579
- Mar 3, 2026
- Statistica
- Bilal Ahmad Peer + 1 more
We consider data modelling under one-inflation for zero-truncated count data, as such data typically arise in capture-recapture modelling. One-inflation in zero-truncated count data has recently received considerable attention. In this regard, the zero-truncated New Discrete distribution and a point-mass distribution at one are combined to create a one-inflated model, namely the one-inflated zero-truncated New Discrete distribution. Its reliability characteristics, generating functions, and distributional properties are investigated in some detail, including the survival function, hazard rate function, probability generating function, characteristic function, variance, skewness, and kurtosis. Monte Carlo simulations have been undertaken to evaluate the effectiveness of the maximum likelihood estimators. To test the compatibility of the proposed model, the baseline model and the proposed model are distinguished using two different test procedures. The adaptability of the suggested model is demonstrated using two real-life datasets from separate domains, taking various performance measures into consideration.
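The one-inflation construction is a two-component mixture: a point mass at one with weight pi plus the zero-truncated base law with weight 1 - pi. The sketch below uses a zero-truncated Poisson as a stand-in for the zero-truncated New Discrete distribution, whose pmf is not given in the abstract:

```python
import math

def zt_poisson_pmf(k, lam):
    """Zero-truncated Poisson pmf on k = 1, 2, ... (a stand-in base law;
    the paper's base is the zero-truncated New Discrete distribution)."""
    return math.exp(-lam) * lam ** k / (math.factorial(k) * (1.0 - math.exp(-lam)))

def one_inflated_pmf(k, pi, lam):
    """One-inflated zero-truncated pmf: a point mass at 1 with weight pi,
    mixed with the zero-truncated base with weight 1 - pi."""
    return pi * (1 if k == 1 else 0) + (1 - pi) * zt_poisson_pmf(k, lam)
```

Whatever the base law, the mixture mechanically raises the probability at one while leaving relative probabilities elsewhere intact, which is what the likelihood-based tests in the abstract are designed to detect.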
- New
- Research Article
- 10.2507/ijsimm25-1-co4
- Mar 3, 2026
- International Journal of Simulation Modelling
- J S Zhang + 1 more
Robust Scheduling under Disruptions Using Transformers and Monte Carlo Simulation
- New
- Research Article
- 10.1139/cgj-2025-0688
- Mar 3, 2026
- Canadian Geotechnical Journal
- Zhengwei Li + 3 more
Although slope failures are inherently three-dimensional (3D) and soils exhibit spatial variability, studies that consider both factors remain limited. To address this issue, an efficient framework is proposed in this work for probabilistic evaluation of 3D slope stability with spatially variable soil properties. The framework couples a 3D convolutional neural network (CNN) with Monte Carlo simulation (MCS) for reliability assessment. Spatial variability in soil strength is modeled using random fields, with realizations obtained through the fast Fourier transform method. A deterministic solver based on discretized limit analysis (DLA) is implemented to evaluate 3D slope stability in spatially heterogeneous soils. A limited number of random field samples is generated, and the associated slope stability responses are evaluated with the DLA-based deterministic solver. The resulting paired input-output data constitute the training set for a 3D CNN, which can then estimate slope stability for new random field realizations. The trained CNN surrogate enables large-scale MCS for probabilistic slope stability analysis with high computational efficiency. Model performance is assessed at both the deterministic solver level and the CNN level. Overall, the proposed DLA-CNN coupling provides an efficient approach for 3D slope reliability analysis, enabling accurate probabilistic evaluation with markedly reduced computational cost.
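The surrogate-accelerated MCS pattern can be sketched end to end with stand-ins: a toy closed-form "solver" in place of the DLA evaluation, a linear least-squares fit in place of the 3D CNN, and a large Monte Carlo run on the cheap surrogate. All formulas and parameter ranges below are illustrative.

```python
import random

def stability_solver(cohesion, friction_tan):
    """Stand-in for the expensive DLA solver: a toy factor of safety."""
    return 0.018 * cohesion + 2.1 * friction_tan

def fit_surrogate(n_train=50, seed=5):
    """Fit fs ~ a*c + b*tan(phi) by least squares on a few solver calls
    (the cheap analogue of training the 3D CNN on DLA outputs)."""
    rng = random.Random(seed)
    pts = [(rng.uniform(10, 40), rng.uniform(0.3, 0.8)) for _ in range(n_train)]
    ys = [stability_solver(c, t) for c, t in pts]
    # normal equations for two coefficients, no intercept
    s_cc = sum(c * c for c, _ in pts)
    s_ct = sum(c * t for c, t in pts)
    s_tt = sum(t * t for _, t in pts)
    s_cy = sum(c * y for (c, _), y in zip(pts, ys))
    s_ty = sum(t * y for (_, t), y in zip(pts, ys))
    det = s_cc * s_tt - s_ct ** 2
    a = (s_cy * s_tt - s_ty * s_ct) / det
    b = (s_ty * s_cc - s_cy * s_ct) / det
    return lambda c, t: a * c + b * t

def failure_probability(surrogate, n=50_000, seed=6):
    """Large-sample MCS on the cheap surrogate: estimate Pr(FS < 1)."""
    rng = random.Random(seed)
    fails = sum(
        surrogate(rng.gauss(25, 5), rng.gauss(0.55, 0.1)) < 1.0
        for _ in range(n)
    )
    return fails / n
```

With random-field inputs the surrogate would take full 3D property grids rather than two scalars, which is what motivates a convolutional architecture in place of this two-feature fit.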
- New
- Research Article
- 10.3390/app16052436
- Mar 3, 2026
- Applied Sciences
- Adewale Amosu + 7 more
The Panoma Field in the Hugoton Embayment, Kansas, has produced significant gas resources from thousands of wells perforating the Permian Chase and Council Grove Groups. Variability in gas production from these formations is controlled by facies-influenced petrophysical properties. The use of geological facies data in numerical modeling is often limited to delineating regions of interest without intrinsic use in estimating petrophysical properties. Machine learning provides opportunities to integrate facies data into the numerical model-building process. In this study, we employ facies data to optimize a numerical-model permeability-matrix scaling parameter using Monte Carlo simulation of Markov Switching Dynamic Regression and machine learning. Realizations of the scaling parameter are included in a machine learning facies-prediction workflow to identify the parameter that maximizes facies prediction accuracy, with test accuracy as high as 83%. A 3D numerical model was constructed to represent the interlayered carbonate, shale, and non-marine sandstone facies typical of the Council Grove intervals. Multiple field development and completion scenarios were evaluated to maximize cumulative gas recovery and assess the role of facies distribution in reservoir performance. History matching against historical gas production demonstrates strong coupling between facies distribution and the optimized permeability, emphasizing the importance of facies data integration in reservoir property modeling and gas production estimation in Permian reservoirs. This implies that probabilistically constrained permeability scaling using the Monte Carlo and machine learning workflow produces more realistic models than traditional approaches.