Articles published on Uncertainty Analysis
- Research Article
- 10.1177/09544089251390138
- Nov 6, 2025
- Proceedings of the Institution of Mechanical Engineers, Part E: Journal of Process Mechanical Engineering
- Adhindra Vs + 3 more
This study investigates the mechanical behavior of 3D-printed layered composites composed of thermoplastic polyurethane and polylactic acid, bonded through a geometric interlocking process. This approach is intended for static-load applications such as orthotic devices, consumer electronics, and automotive interiors, where long-term cyclic loading is not a primary concern. Such applications demand strong interfacial bonding and dimensional stability under single or low-cycle mechanical loads. The objective was to enhance interfacial adhesion using a slit-type penetration pattern and to evaluate the effects of post-print heat treatment on mechanical performance. Five composite specimens were subjected to heat treatment at temperatures ranging from room temperature to 110°C, under controlled pressure via a custom-built mold that ensured uniform heat distribution around the interlock region. Scanning electron microscopy was used to analyze porosity variations, and tensile testing was conducted to assess load–displacement behavior and elastic modulus. An uncertainty analysis was performed to quantify variability in peak load and displacement. The results showed that porosity decreased with increasing heat treatment temperature, and mechanical strength followed a bell-shaped trend, peaking at 80°C. Excessive heating compromised interlock integrity, leading to reduced mechanical performance. Fractographic analysis revealed distinct failure modes across temperature conditions, highlighting the influence of heat treatment on interfacial bonding. The uncertainty study validated the reliability of the mechanical data. Overall, the findings offer valuable insights into optimizing strength in multimaterial 3D printing, presenting geometric interlocking as a robust alternative to fusion-based bonding techniques and contributing to the durability of printed structures.
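As a quick illustration of the uncertainty analysis mentioned in this abstract, the sketch below propagates hypothetical repeated peak-load readings into a Type A standard uncertainty and an expanded uncertainty with coverage factor k = 2; the specimen values and the coverage factor are assumptions, not the study's data.

```python
import statistics

# Hypothetical repeated peak-load readings (kN) for one heat-treatment condition;
# the actual specimen data from the study are not reproduced here.
peak_loads_kN = [1.82, 1.79, 1.85, 1.80, 1.84]

n = len(peak_loads_kN)
mean = statistics.mean(peak_loads_kN)
std_dev = statistics.stdev(peak_loads_kN)      # sample standard deviation
std_uncertainty = std_dev / n ** 0.5           # Type A standard uncertainty of the mean
expanded = 2 * std_uncertainty                 # expanded uncertainty, coverage factor k = 2

print(f"peak load = {mean:.3f} kN ± {expanded:.3f} kN (k = 2)")
```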
- Research Article
- 10.3389/fsoil.2025.1673628
- Nov 6, 2025
- Frontiers in Soil Science
- Carlos Carbajal-Llosa + 2 more
In agricultural systems, soil pH and electrical conductivity (EC) are crucial chemical properties that directly affect nutrient availability and microbial activity, but the challenging environment of the Peruvian Andes has limited research on their estimation. This study aimed to develop an ensemble learning method to predict soil pH and EC in Andean agroecosystems using environmental predictors. Using simple and weighted averaging, we developed a heterogeneous ensemble learning approach that integrates machine learning (ML) algorithms, including Support Vector Machine (SVM), Artificial Neural Network (ANN), Random Forest (RF), and Extreme Gradient Boosting (XGBoost). The weighted ensemble assigns weights to models based on their predictive accuracy, measured by R² from spatial cross-validation. Spatial patterns were noticeable, with pH displaying greater spatial clustering than EC. Elevation was the most important predictor in the ML models for both parameters. Ensemble models significantly outperformed individual models, with the weighted ensemble achieving R² > 0.93 and reducing RMSE by approximately 72%. Among standalone models, RF and XGBoost performed best for pH, while SVM performed best for EC. ANN models were the least effective. Uncertainty analysis indicated high confidence in pH predictions but moderate to high uncertainty in EC predictions, suggesting that EC is more challenging to predict. Ensemble models with optimized weighting provide robust and accurate mapping of spatially autocorrelated soil properties. The high-confidence pH maps are reliable for soil management decisions, while the EC predictions, though more uncertain, effectively identify priority areas for future sampling and investigation.
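A minimal sketch of the weighted-averaging idea described above, assuming illustrative base-model predictions and spatial-cross-validation R² scores; none of the values are from the study.

```python
import numpy as np

# Hypothetical test-point predictions from four base learners and their
# spatial-cross-validation R² scores; values are illustrative only.
predictions = {
    "SVM":     np.array([6.1, 6.4, 5.9]),
    "ANN":     np.array([5.8, 6.0, 5.7]),
    "RF":      np.array([6.3, 6.5, 6.0]),
    "XGBoost": np.array([6.2, 6.6, 6.1]),
}
cv_r2 = {"SVM": 0.61, "ANN": 0.48, "RF": 0.74, "XGBoost": 0.72}

# Simple average ensemble
simple = np.mean(list(predictions.values()), axis=0)

# Weighted average ensemble: weights proportional to cross-validated R²
weights = np.array([cv_r2[m] for m in predictions])
weights = weights / weights.sum()
weighted = sum(w * predictions[m] for w, m in zip(weights, predictions))

print("simple  :", simple)
print("weighted:", weighted)
```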
- Research Article
- 10.3389/fpubh.2025.1446073
- Nov 6, 2025
- Frontiers in Public Health
- Amirkeyvan Ghazvinian + 2 more
Introduction: Millions of people living with HIV around the world depend on access to antiretroviral (ARV) drugs, yet the supply chain continues to confront obstacles such as rising freight costs and delivery delays. These inefficiencies put timely access to life-saving medications at risk, especially in resource-limited settings. To find ways to improve the HIV drug supply chain, this study investigates the underlying causes of these disruptions.
Objectives: This study aims to: (1) assess and prioritize risks in the HIV drug supply chain, focusing on failure modes impacting delivery timelines and freight costs; and (2) enhance supply chain substantivity (fulfillment capacity) and resilience (disruption adaptability) through evidence-based strategies.
Methods: Using Z-numbers to handle uncertainty, we developed a hybrid multi-criteria decision-making framework that integrates Z-SWARA, Z-WASPAS, and Z-DEA-FMEA. The method uses FMEA to assess risks and identify failure modes, then ranks them with respect to freight costs and delivery timeliness using hybrid rankings that combine RPN, Z-SWARA/Z-WASPAS scores, and Z-DEA-FMEA efficiencies.
Results: Hybrid rankings indicate that the primary contributors to supply chain inefficiencies are Quantity Errors (F14, ranked 1st, Q_total = 0.9374), Pack Price Discrepancies (F16, ranked 2nd, 0.8430), and Unit Miscalculation (F13, ranked 3rd, 0.7261). The Z-WASPAS analysis emphasizes the financial implications of F16, placing it at the top for Freight Costs (K = 0.178). Additionally, Z-DEA-FMEA notes efficiency shifts, including Delivery Confirmation (F06, θ = 0.7303, Delivery). For Weight Failures (F20), the Freight score (Q_i = 0.6991, ranked 3rd) surpasses that of Delivery (0.6753, ranked 4th), while Shipment Mode Selection (F04) holds the 5th position overall (Q_total = 0.6741).
Discussion: Aiming to improve the availability of ARV medications, our approach integrates risk, uncertainty, and efficiency analysis to formulate evidence-based strategies using Z-numbers. It redefines the concepts of resilience and substantivity, providing decision-makers with a framework to enhance delivery speed and minimize costs. These improvements strengthen global health logistics.
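For orientation, the sketch below computes conventional FMEA risk priority numbers for a few of the failure modes named above; the 1–10 severity/occurrence/detection ratings are hypothetical, and the paper's Z-number extensions (Z-SWARA, Z-WASPAS, Z-DEA-FMEA) are not reproduced.

```python
# Conventional FMEA risk priority numbers (RPN = Severity x Occurrence x Detection)
# for a few illustrative failure modes; the ratings below are hypothetical.
failure_modes = {
    "F13 Unit Miscalculation":      {"S": 8, "O": 6, "D": 5},
    "F14 Quantity Errors":          {"S": 9, "O": 7, "D": 6},
    "F16 Pack Price Discrepancies": {"S": 7, "O": 8, "D": 6},
}

rpn = {name: r["S"] * r["O"] * r["D"] for name, r in failure_modes.items()}
for name, value in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: RPN = {value}")
```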
- Research Article
- 10.3390/app152111815
- Nov 5, 2025
- Applied Sciences
- Lucas Álvarez-Piñeiro + 3 more
This study evaluates the feasibility of fully renewable energy systems on El Hierro, the smallest and most isolated island of the Canary Archipelago (Spain), contributing to the broader effort to decarbonize the European economy. By 2040, the island’s energy demand is projected to reach 80–110 GWh annually, assuming full economic decarbonization. Currently, El Hierro faces challenges due to its dependence on fossil fuels and the inherent variability of renewable sources. To ensure system reliability, the study emphasizes the integration of renewable and storage technologies. Two scenarios are modeled using HOMER Pro 3.18.4 software with probabilistic methods to capture variability in generation and demand. The first scenario, BAU, represents the current system enhanced with electric vehicles, while the second, Efficiency, incorporates energy efficiency improvements and collective mobility policies. Both prioritize electrification and derive an optimal generation mix based on economic and technical constraints to minimize the Levelized Cost of Energy (LCOE). The approach takes advantage of El Hierro’s abundant solar and wind resources, complemented by reversible pumped hydro storage and megabatteries. Fully renewable systems can meet demand reliably, producing about a 30% energy surplus with an LCOE of roughly 10 c€/kWh. The final BAU scenario includes 53 MW of solar PV, 16 MW of wind, and a storage system of 40 MW–800 MWh. The Efficiency scenario has 42 MW of solar PV, 11.5 MW of wind, and 35 MW–550 MWh of storage. Uncertainty analysis indicates that maintaining system reliability requires an approximate 10% increase in both installed capacity and costs. This translates into an additional 7 MW of solar PV and 6 MW–23.5 MWh of batteries in the BAU scenario, and 6 MW and 4 MW–16 MWh in the Efficiency scenario.
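A minimal sketch of an annualized LCOE calculation of the kind HOMER-style optimizations minimize; the cost, lifetime, and discount-rate inputs are assumptions for illustration, not the study's scenario data.

```python
# Minimal LCOE calculation: LCOE = annualized total cost / annual energy served.
# All inputs are illustrative assumptions.
def capital_recovery_factor(rate: float, years: int) -> float:
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

capex_eur = 120e6            # hypothetical installed cost of PV + wind + storage
annual_om_eur = 2.5e6        # hypothetical fixed O&M per year
discount_rate = 0.05
lifetime_years = 25
energy_served_kwh = 95e6     # within the 80-110 GWh/yr demand range quoted above

annualized_cost = capex_eur * capital_recovery_factor(discount_rate, lifetime_years) + annual_om_eur
lcoe_eur_per_kwh = annualized_cost / energy_served_kwh
print(f"LCOE ≈ {100 * lcoe_eur_per_kwh:.1f} c€/kWh")
```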
- Research Article
- 10.1021/acs.jchemed.5c00634
- Nov 5, 2025
- Journal of Chemical Education
- Chrystian De Oliveira Bellin + 1 more
Optimizing Low-Cost Educational Calorimeters: A Quantitative Analysis of Experimental Uncertainty
- Research Article
- 10.1115/1.4070104
- Nov 5, 2025
- ASME Journal of Heat and Mass Transfer
- Rishav Aich + 1 more
The hydrodynamic and thermal behavior of flow through a rectangular channel filled with fluid-saturated anisotropic porous media is investigated numerically. The flow is governed by the Darcy–Brinkman model, incorporating hydrodynamic and thermal anisotropy. The finite volume method (FVM) is implemented to solve these coupled partial differential equations. The permeability tensor is characterized by the principal permeabilities and their orientation angle with respect to the axial direction. This study examines the impact of magnetic field, anisotropy, and viscous dissipation on velocity profiles, skin friction, and heat transfer, focusing on the local Nusselt number. Interestingly, introducing anisotropy enhances heat transfer under specific conditions compared to the isotropic case. Additionally, increasing the Hartmann number further modifies flow behavior, leading to enhanced heat transfer in the axial direction. The clear-fluid-compatible model demonstrates improved heat transfer efficiency over the other two models. Anisotropy can significantly enhance heat transfer efficiency, achieving an improvement of over 42% compared to the isotropic case. However, under certain conditions, it can also reduce the efficiency of the heat transfer process by nearly 39%. A correlation has been developed to describe the dependence of the local Nusselt number on various parameters. Sensitivity analysis identified the most influential parameters affecting the local Nusselt number, while uncertainty analysis confirmed the robustness and reliability of the developed correlation for practical heat transfer applications. The present results are validated against published experimental and numerical data. The insights gained from this study are vital for optimizing the design of thermal systems utilizing anisotropic porous materials.
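To illustrate the kind of uncertainty analysis applied to a developed correlation, the sketch below propagates input scatter through a hypothetical power-law surrogate for the local Nusselt number via Monte Carlo sampling; the functional form and coefficients are assumptions, as the paper's correlation is not reproduced in this abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical power-law surrogate for a local-Nusselt-number correlation,
# Nu = C * Ha**a * K_ratio**b; the coefficients and the functional form are NOT
# taken from the paper, which does not list its correlation here.
def nu_correlation(Ha, K_ratio, C=4.0, a=0.15, b=-0.10):
    return C * Ha ** a * K_ratio ** b

N = 100_000
Ha = rng.normal(10.0, 0.5, N)         # Hartmann number with assumed 5% scatter
K_ratio = rng.normal(2.0, 0.2, N)     # anisotropic permeability ratio, assumed 10% scatter

nu = nu_correlation(Ha, K_ratio)
print(f"Nu = {nu.mean():.3f} ± {nu.std(ddof=1):.3f} (1-sigma, Monte Carlo propagation)")
```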
- Research Article
- 10.1088/1681-7575/ae1b0a
- Nov 4, 2025
- Metrologia
- Sean Jollota + 5 more
Background/Purpose: Radiopharmaceutical therapy (RPT) with alpha-emitting radionuclides, such as 225Ac, offers highly localized dose delivery due to its short particle range and high linear energy transfer (LET). However, unlike external beam radiotherapy (EBRT) and brachytherapy, which have traceable absorbed dose standards, RPT currently lacks a standardized absorbed dose measurement framework. This study aims to quantify the absorbed dose to air from a 225Ac source using an extrapolation chamber, supported by Monte Carlo (MC) simulations, to establish a robust methodology for dose validation of computational methods.
Methods: An extrapolation chamber was used to measure the absorbed dose to air from a drop-cast 225Ac source, with source activity determined using a Low-Energy Germanium (LEGe) detector. High-resolution 2D imaging characterized the spatial distribution of deposited activity, enabling precise source geometry modeling for MC simulations. Self-attenuation effects were quantified using alpha spectrometry, and automated voltage control improved the measurement repeatability of the extrapolation chamber. Absorbed dose calculations were compared between experimental and MC results across multiple air gaps.
Results: Experimental and simulated absorbed dose values were in strong agreement, with experimental measurements consistently 1–2% higher than MC predictions across all air gaps. The ionizing-radiation Quantum Imaging Detector (iQID) system provided activity mapping for source characterization, reducing uncertainties in MC modeling. The integration of automated chamber voltage control enhanced measurement precision, while uncertainty analyses highlighted activity determination and alignment as key contributors to variability.
Conclusions: This study establishes a validated methodology for quantifying 225Ac absorbed dose using extrapolation chamber measurements. The findings support the development of traceable absorbed dose standards for RPT and highlight the need for further refinement in alignment protocols and activity quantification. Future work should explore comparisons with time-integrated activity (TIA)-based absorbed dose calculations to align experimental methodologies with clinical RPT dosimetry practices.
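As background to the measurement principle, the sketch below evaluates the standard extrapolation-chamber relation for absorbed dose rate to air from the slope of ionization current versus air gap; the electrode area and current readings are hypothetical and do not reflect the 225Ac measurements reported here.

```python
import numpy as np

# Illustrative extrapolation-chamber evaluation: the absorbed-dose rate to air is
# proportional to the slope of ionization current versus air-gap width,
#   D_air = (W/e) / (rho_air * A) * dI/dl,
# extrapolated toward zero gap. The current readings below are hypothetical.
W_over_e = 33.97          # J/C, mean energy per ion pair in dry air
rho_air = 1.204           # kg/m^3 at 20 degC, 101.325 kPa
area_m2 = 1.0e-4          # assumed collecting-electrode area (1 cm^2)

gap_m = np.array([0.5e-3, 1.0e-3, 1.5e-3, 2.0e-3])              # air-gap settings
current_A = np.array([0.82e-12, 1.61e-12, 2.43e-12, 3.20e-12])  # leakage-corrected currents

slope, _ = np.polyfit(gap_m, current_A, 1)                      # dI/dl in A/m
dose_rate_gy_per_s = W_over_e / (rho_air * area_m2) * slope
print(f"absorbed dose rate to air ≈ {dose_rate_gy_per_s:.2e} Gy/s")
```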
- Research Article
- 10.30564/fls.v7i12.11494
- Nov 4, 2025
- Forum for Linguistic Studies
- Suleiman Ibrahim Mohammad + 6 more
This paper presents a mathematical framework for quantifying graded language mixing in media texts surrounding a policy reform. We model each document as generated by probabilistic n-gram models for two languages, interpret the resulting posterior probabilities as soft-membership degrees, and apply Shannon entropy to measure per-document mixing. A fuzzification exponent controls assignment sharpness, and aggregate entropy across documents yields a corpus-level metric tracked over pre- and post-reform intervals. In a case study of 20 headlines, mean entropy rose from 0.52 to 0.68 nats (∆ = 0.16), indicating increased code-mixing after the policy change. Statistical validation via a paired t-test (t = 3.27, p < 0.01) and a permutation test (p = 0.005) confirms the significance of this shift. Analysis of soft-membership distributions reveals a drop in average English membership from 0.77 to 0.52, further illustrating editorial adaptation. The modular implementation enables scalable analysis of large corpora, and an open-source toolkit is provided to promote reproducibility and extension to other bilingual or multilingual settings. We discuss limitations related to parameter sensitivity, model assumptions, and sample size, and outline future extensions involving imprecise-probability bounds, contextual embeddings, dynamic time-series modeling, and topic-augmented uncertainty. Our results demonstrate the power of information-theoretic tools for detecting subtle shifts in media discourse in response to regulatory changes.
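A minimal sketch of the per-document mixing-entropy computation described above, assuming a softmax posterior over two n-gram log-likelihoods and a power-transform fuzzification exponent; the log-likelihood values and the exact fuzzification scheme are assumptions, not the paper's implementation.

```python
import math

def mixing_entropy(logp_en: float, logp_other: float, m: float = 1.0) -> float:
    """Shannon entropy (nats) of the two-language soft membership for one document.

    logp_en / logp_other are document log-likelihoods under the two n-gram models;
    m is a fuzzification exponent (m > 1 sharpens the memberships)."""
    # posterior membership via a softmax over log-likelihoods
    p_en = 1.0 / (1.0 + math.exp(logp_other - logp_en))
    # fuzzification: re-normalized power transform of the memberships
    u_en = p_en ** m / (p_en ** m + (1 - p_en) ** m)
    u_other = 1.0 - u_en
    return -sum(u * math.log(u) for u in (u_en, u_other) if u > 0)

headlines = [(-41.2, -44.8), (-39.5, -39.9), (-50.3, -47.6)]   # (log P_en, log P_other), hypothetical
entropies = [mixing_entropy(a, b) for a, b in headlines]
print(f"mean mixing entropy = {sum(entropies) / len(entropies):.3f} nats")
```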
- Research Article
- 10.3390/inventions10060100
- Nov 4, 2025
- Inventions
- Jaime Sánchez Gallego
This paper develops a theoretical framework and a numerical implementation for real-time estimation of the gross mass of heavy vehicles using only on-board signals: tire inflation pressure from the TPMS and radial deformation inferred from a monocular chassis camera. Each wheel is modeled as a single-degree-of-freedom radial oscillator with pressure-dependent stiffness k_r(P) and damping c_r(P). The contact patch geometry follows a compressed-arc approximation that maps radial deformation δ to contact length L(δ) and area S(δ). Two independent force surrogates, F_k = k_r(P)·δ and F_q = q(P)·S(δ) (where q(P) denotes the mean contact pressure), are constructed and fused by an adaptive Kalman filter operating at 30 Hz to recover per-wheel loads and total mass. Tuning the fusion weight λ yields a relative mass estimation error below 5% across 0.001 ≤ δ ≤ 0.20 m, and the maximum observed error is 4.99%. Numerical experiments using fixed-step RK4 and embedded RK45 methods confirm the accuracy and real-time feasibility on commodity hardware (runtime < 33 ms per step). Uncertainty analysis based on Latin hypercube sampling, partial rank correlation coefficients (PRCC), and Sobol indices shows robustness to parameter perturbations (±5% inflation, ±10% stiffness, ±15% damping, ±1° camera pitch, ±2 kPa TPMS bias). Observability analysis supports identifiability under the tested regimes. The estimator delivers wheel and axle loads for on-board alerts, telematics, V2X pre-screening for road user charging and weigh-in-motion technology, and friction-aware control.
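The sketch below shows a scalar Kalman-style fusion of the two force surrogates at 30 Hz; the noise levels, blend weight, and load values are assumptions for illustration, and the paper's adaptive filter and tire model are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar Kalman-style fusion of the two per-wheel force surrogates,
# F_k = k_r(P)*delta and F_q = q(P)*S(delta), sampled at 30 Hz (300 samples = 10 s).
# The true load, noise levels, and blend weight lam are assumptions.
true_force_N = 24_000.0
lam = 0.5                                  # relative trust in F_k versus F_q
x, P = 20_000.0, 1e7                       # load estimate (N) and its variance
Q = 1e4                                    # process noise (slowly varying load)
Rk, Rq = 800.0 ** 2, 1200.0 ** 2           # assumed surrogate noise variances

for _ in range(300):
    Fk = true_force_N + rng.normal(0.0, 800.0)    # noisy stiffness-based surrogate
    Fq = true_force_N + rng.normal(0.0, 1200.0)   # noisy contact-patch surrogate
    z = lam * Fk + (1 - lam) * Fq                 # blended measurement
    R = lam ** 2 * Rk + (1 - lam) ** 2 * Rq       # variance of the blend
    P = P + Q                                     # predict (static load model)
    K = P / (P + R)                               # Kalman gain
    x = x + K * (z - x)                           # update
    P = (1 - K) * P

print(f"fused wheel load ≈ {x:.0f} N (true {true_force_N:.0f} N)")
```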
- Research Article
- 10.1016/j.wasman.2025.115143
- Nov 1, 2025
- Waste management (New York, N.Y.)
- Xi He + 5 more
Critical evaluation of technological, environmental, and economic recyclability for typical EOL PV cells considering technological iteration and temporal dynamics.
- Research Article
- 10.1016/j.oregeorev.2025.106844
- Nov 1, 2025
- Ore Geology Reviews
- Lahiru M.A Nagasingha + 2 more
A generative neural network approach to uncertainty and risk-return analysis in mineral prospectivity modelling
- Research Article
- 10.1016/j.actaastro.2025.08.008
- Nov 1, 2025
- Acta Astronautica
- Junkai Jia + 5 more
Static and dynamic uncertainty analyses for thrust regulation performance of solid divert and attitude control motor
- Research Article
- 10.1016/j.compbiomed.2025.111133
- Nov 1, 2025
- Computers in biology and medicine
- P Vaidehi + 2 more
Magnetically controlled oscillatory flow of micropolar-Jeffrey fluid in a diseased artery: A sensitivity analysis-based model for non-invasive thrombosis treatment.
- Research Article
- 10.1063/5.0303249
- Nov 1, 2025
- Journal of Renewable and Sustainable Energy
- Jia-Hua Li + 5 more
To promote the utilization of renewable energy and improve the economic, low-carbon, and energy-efficiency performance of integrated energy systems, this study proposes an optimal capacity allocation and scheduling model for a combined cooling, heating, and power system with wind, solar, hydrogen, and carbon collaboration. The model couples new energy devices and energy storage devices, and integrates carbon capture, a ladder carbon trading mechanism, and diversified hydrogen utilization. The ε-constraint algorithm and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), combined with the entropy weight method, are used to solve for the annual total cost (ATC), annual carbon emissions (ACE), and primary energy consumption (PEC) under six different scenarios, to optimize system configuration and operation scheduling strategies, and to study the impact of each component on the overall performance of the system. The paper also presents a sensitivity analysis of Scenario 5 with respect to carbon price, hydrogen doping ratio, and storage cap, and an uncertainty analysis of efficiency fluctuations of the photovoltaic–thermoelectric modules. Simulation results show that, compared to the conventional system, the proposed model reduces ACE by 5981.57 tons and PEC by 959,000 m³ while achieving a lower ATC ($592,230), with good stability and robustness. These findings underscore the potential of the proposed system to accelerate decarbonization, improve operational efficiency, and support the transition to a more sustainable energy future.
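A minimal sketch of the entropy-weight/TOPSIS ranking step mentioned above, applied to illustrative scenario scores for ATC, ACE, and PEC (all treated as cost-type criteria); the matrix values are hypothetical, not the six optimized scenarios.

```python
import numpy as np

# Rows = candidate scenarios, columns = [ATC ($), ACE (t), PEC (m^3)]; hypothetical values.
X = np.array([
    [640_000, 8_500, 1.20e6],
    [612_000, 7_900, 1.10e6],
    [592_230, 6_800, 1.05e6],
    [605_000, 7_200, 1.08e6],
])

# Entropy weights
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
w = (1 - E) / (1 - E).sum()

# TOPSIS with cost-type criteria (ideal = column minimum, anti-ideal = maximum)
V = w * X / np.linalg.norm(X, axis=0)
ideal, anti = V.min(axis=0), V.max(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
print("closeness coefficients:", np.round(closeness, 3))
print("best scenario index   :", int(closeness.argmax()))
```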
- Research Article
- 10.1016/j.oceaneng.2025.122185
- Nov 1, 2025
- Ocean Engineering
- Ren-Jie Wu + 1 more
Multi-scale uncertainties analysis method for marine RC structures subjected to spatiotemporal random deterioration
- Research Article
- 10.1016/j.enconman.2025.120120
- Nov 1, 2025
- Energy Conversion and Management
- Dibyendu Roy + 5 more
Multi-criteria decision-making and uncertainty analyses of off-grid hybrid renewable energy systems for an island community
- Research Article
- 10.1038/s41598-025-22136-6
- Oct 31, 2025
- Scientific Reports
- Rômulo Marques-Carvalho + 5 more
This study applies the Gaussian Analytical Hierarchy Process (Gaussian AHP) to landslide susceptibility mapping and demonstrates its superior methodological rigor and predictive performance relative to the traditional AHP method. Susceptibility maps produced by Gaussian AHP allocated 26.31% of the study area to the very high susceptibility class, outperforming the traditional AHP’s estimated share (23.52%), and achieved a more balanced distribution across all five classes. Validation against a high-resolution inventory of 97,742 landslide samples collected during the February 2023 São Sebastião event (divided into 70% training and 30% validation subsets) yielded improved metrics for the Gaussian versus the traditional AHP method: ROC area under the curve of 0.6360 versus 0.6220; overall accuracy of 0.6364 versus 0.6229; balanced accuracy of 0.6356 versus 0.6221; and sensitivity of 0.3585 versus 0.3116. An uncertainty analysis quantified a 56.16% disagreement between the two methods, revealing that Gaussian AHP reduced classification ambiguity in critical classes. A complementary density-based assessment, comparing observed landslide crown points and scar polygons against susceptibility class areas, showed that Gaussian AHP produced a gradual, coherent increase in normalized landslide density from very low to very high susceptibility, whereas traditional AHP displayed sharp breaks in intermediate classes. These findings confirm that Gaussian AHP enhances objectivity, spatial coherence, and operational reliability, better aligning high-density landslide clusters with the highest susceptibility zones. By leveraging statistical weighting, Gaussian AHP streamlines data preprocessing and reduces the need for expert calibration, making it well suited for assessments in data-rich environments. The statistical weighting procedure facilitates the integration of diverse geospatial datasets and supports robust, reproducible multicriteria decision analysis. Its integration with accurate machine-learning-derived land use/land cover data and refined climate data is recommended to further improve predictive accuracy and support proactive landslide risk management strategies. The proposed approach can also serve operational purposes, provided that near-real-time climate data, updated geospatial databases, and substantial computing resources are available.
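For reference, the sketch below computes the validation metrics quoted above (overall accuracy, balanced accuracy, sensitivity) from a small hypothetical set of binarized predictions; the labels are illustrative, not the São Sebastião inventory.

```python
import numpy as np

# Hypothetical landslide (1) / non-landslide (0) labels versus binarized susceptibility predictions.
y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0, 0, 1])
y_pred = np.array([1, 0, 1, 0, 0, 1, 0, 0, 0, 1])

tp = int(np.sum((y_true == 1) & (y_pred == 1)))
tn = int(np.sum((y_true == 0) & (y_pred == 0)))
fp = int(np.sum((y_true == 0) & (y_pred == 1)))
fn = int(np.sum((y_true == 1) & (y_pred == 0)))

sensitivity = tp / (tp + fn)                 # true-positive rate (recall)
specificity = tn / (tn + fp)
overall_accuracy = (tp + tn) / len(y_true)
balanced_accuracy = (sensitivity + specificity) / 2

print(f"overall accuracy  = {overall_accuracy:.4f}")
print(f"balanced accuracy = {balanced_accuracy:.4f}")
print(f"sensitivity       = {sensitivity:.4f}")
```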
- Research Article
- 10.3390/en18215744
- Oct 31, 2025
- Energies
- Qiannan Yu + 6 more
Unconventional reservoirs are critical for future energy supply, but present major challenges for predictions of production due to their ultra-low permeability, strong pressure sensitivity, and non-Darcy flow. Mechanistically grounded physics-based models depend on uncertain parameters derived from laboratory tests or empirical correlations, limiting their field reliability. A data-enhanced variable start-up pressure gradient framework is developed herein, integrating flow physics with physics-informed neural networks (PINNs), surrogate models, and Bayesian optimization. The framework adaptively refines key parameters to represent spatial and temporal variability in reservoir behavior. Validation with field production data shows significantly improved accuracy and robustness compared to baseline physics-based and purely data-driven approaches. Sensitivity and uncertainty analyses confirm the physical consistency of the corrected parameters and the model’s stable predictive performance under perturbations. Comparative results demonstrate that the data-enhanced model outperforms conventional models in accuracy, generalization, and interpretability. This study provides a unified and scalable approach that bridges physics and data, offering a reliable tool for prediction, real-time adaptation, and decision support in unconventional reservoir development.
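As a simplified stand-in for the parameter-refinement idea (the PINN/surrogate/Bayesian-optimization pipeline itself is not reproduced here), the sketch below calibrates a start-up (threshold) pressure gradient in a toy 1D low-permeability flow model against noisy synthetic rates; the model form and all values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Toy threshold-pressure-gradient model: q = T * max(dp/dx - G, 0),
# fitted to noisy synthetic "field" rates. T, G_true, and the data are hypothetical.
T = 0.8                                   # lumped transmissibility (arbitrary units)
G_true = 0.05                             # MPa/m
grad = np.linspace(0.02, 0.20, 25)        # imposed pressure gradients
q_obs = T * np.maximum(grad - G_true, 0.0) + rng.normal(0.0, 0.002, grad.size)

def misfit(G):
    q_model = T * np.maximum(grad - G, 0.0)
    return float(np.mean((q_model - q_obs) ** 2))

result = minimize_scalar(misfit, bounds=(0.0, 0.15), method="bounded")
print(f"calibrated start-up gradient G ≈ {result.x:.4f} MPa/m (true {G_true} MPa/m)")
```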
- Research Article
- 10.1088/1361-6560/ae1543
- Oct 30, 2025
- Physics in Medicine & Biology
- Han Li + 4 more
Objective: Transcranial focused ultrasound (tFUS) for neuromodulation has attracted increasing attention, yet accurate pre-procedural planning and dose estimation is constrained by oversimplified skull representations and by the neglect of wave interactions induced by transducer–skull spacing. This study aims to develop and validate a computationally efficient, CT-informed analytical framework for predicting frequency-dependent insertion loss.
Approach: We propose a multi-layer analytical framework that incorporates four key factors (skull thickness, skull density ratio, ultrasound insertion angle, and the transducer's physical geometry and spacing from the skull) to predict frequency-dependent pressure insertion loss. Model accuracy was evaluated against k-Wave simulations and hydrophone measurements in 20 ex-vivo human skulls across the 100 kHz to 1000 kHz frequency range.
Main Results: Median prediction deviations for peak pressure insertion loss were +1.1 dB (interquartile range (IQR): +0.2 dB to +2.2 dB) relative to measurement and -1.7 dB (IQR: -2.7 dB to -0.7 dB) relative to simulation. The corresponding median percentage errors were +30.1% (IQR: +9.5% to +35.6%) and -20.3% (IQR: -31.7% to -10.1%), respectively. Median Spearman correlation and cosine similarity values reached 0.92 (IQR: 0.86–0.98, p < 0.001) and 0.73 (IQR: 0.49–0.82), respectively. Uncertainty analysis showed that varying the transducer–skull spacing resulted in a median absolute percentage uncertainty of 18.1% (IQR: 17.2% to 21.3%).
Significance: The balance of accuracy and efficiency of the proposed CT-informed multi-layer model makes it a practical tool for transducer positioning, frequency selection, and dose control in tFUS neuromodulation, with potential to improve reproducibility and safety in clinical applications.
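A minimal sketch of the insertion-loss and IQR bookkeeping used above, assuming hypothetical free-field and transcranial peak pressures rather than the ex-vivo data.

```python
import numpy as np

# Insertion loss in dB from peak-pressure ratios, plus median/IQR summary statistics;
# the pressure values are hypothetical.
p_free_field_kpa = np.array([120.0, 118.0, 125.0, 121.0, 119.0])   # without skull
p_transcranial_kpa = np.array([54.0, 60.0, 49.0, 57.0, 52.0])      # through skull

insertion_loss_db = 20 * np.log10(p_free_field_kpa / p_transcranial_kpa)

median = np.median(insertion_loss_db)
q1, q3 = np.percentile(insertion_loss_db, [25, 75])
print(f"insertion loss: median {median:.1f} dB (IQR {q1:.1f}-{q3:.1f} dB)")
```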
- Research Article
- 10.1080/00084433.2025.2578556
- Oct 29, 2025
- Canadian Metallurgical Quarterly
- Vijay Pratap Singh + 6 more
This work investigates the accumulative roll bonding (ARB)-induced tribological behaviour of Al6061 alloy using experimental evaluation and predictive modelling through response surface methodology (RSM) and machine learning (ML) algorithms. The novelty of this study lies in its integrated methodology, combining strain-induced grain refinement by multi-pass ARB with regression modelling and uncertainty analysis to quantitatively optimise tribological performance under dry sliding contact. Unlike earlier studies that address ARB or ML separately, this work provides a unified framework linking microstructural evolution with data-driven predictions. The alloy underwent up to five ARB passes and tribological testing on a ball-on-flat tribometer under the ASTM G133-05 standard. The findings showed significant improvements with increasing ARB passes: the 5-pass sample at 0.36 m s⁻¹ sliding velocity achieved the lowest coefficient of friction (0.12048) and wear rate (0.000138 mm³/(N·m)). FESEM–EDAX analysis confirmed grain refinement, stable tribolayer formation, and a reduction in surface oxygen from 30.4 wt.% to 12.1 wt.%. ML models (R² > 0.99) validated these results, with Random Forest further quantifying uncertainty: 92% of predictions for COF and 90% for wear rate lay within 95% confidence intervals. The integrated ARB–ML approach demonstrates a reliable pathway for wear performance improvement, with engineering implications in aerospace, defence, and automotive applications.
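One simple way to obtain the kind of Random Forest prediction intervals described above is to use the spread of per-tree predictions; the sketch below does this on synthetic friction-style data and is an illustration rather than the paper's exact uncertainty procedure.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Synthetic friction-style data: features = (ARB passes, sliding velocity), target = COF.
X = rng.uniform([1, 0.1], [5, 0.6], size=(200, 2))
y = 0.3 - 0.03 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 0.01, 200)

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

# Empirical 95% interval from the spread of per-tree predictions, and its coverage.
tree_preds = np.stack([tree.predict(X_test) for tree in rf.estimators_])
lower, upper = np.percentile(tree_preds, [2.5, 97.5], axis=0)

coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical 95% interval coverage on held-out data: {100 * coverage:.0f}%")
```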