Articles published on Uncertain Data
- Research Article
- 10.4028/p-bxn0sy
- Nov 11, 2025
- International Journal of Engineering Research in Africa
- Abdallah Nazih + 2 more
In this study, distributed generators (DGs) based on renewable energy sources (RESs), together with capacitor banks, are optimally allocated in power distribution networks using a proposed multi-objective optimization approach. The approach maximizes the hosting capacity (HC) of RES DGs while reducing energy loss and voltage deviation in the networks. Uncertainties of load demand and RES output are considered, and to keep the optimization tractable a reduction criterion is used to cut down the large number of uncertain scenarios. The approach is applied to practical and standard power networks in several cases under these uncertain scenarios. A comparative study with other algorithms is performed, and the robustness of the proposed approach is verified in a long-term dynamic environment. The impact of changing parameter values on performance is also investigated, Wilcoxon statistical tests are applied, and the weighted-sum and Pareto-front techniques are compared. The results reveal the efficacy of the proposed approach on power distribution networks.
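A minimal sketch of the weighted-sum versus Pareto-front comparison the abstract mentions, using made-up objective values for a few hypothetical DG/capacitor placements; this is illustrative only and not the paper's optimization algorithm.

```python
# Minimal sketch (not the paper's algorithm): weighted-sum scalarization of three
# objectives -- maximize hosting capacity (HC), minimize energy loss, minimize
# voltage deviation -- plus a simple Pareto (non-dominance) check.
import numpy as np

# Each row: [HC in MW, energy loss in MWh, voltage deviation in p.u.] (made-up values)
candidates = np.array([
    [12.0, 410.0, 0.031],
    [15.5, 455.0, 0.027],
    [10.2, 380.0, 0.035],
])

# Normalize each objective to [0, 1]; HC is a benefit, the others are costs.
norm = (candidates - candidates.min(axis=0)) / np.ptp(candidates, axis=0)
norm[:, 0] = 1.0 - norm[:, 0]          # flip HC so that lower = better for all columns

weights = np.array([0.5, 0.3, 0.2])    # example weights; the paper tunes such choices
scores = norm @ weights
print("weighted-sum choice:", int(np.argmin(scores)), "scores:", np.round(scores, 3))

def pareto_mask(costs):
    """A candidate is non-dominated if no other is at least as good everywhere and better somewhere."""
    n = len(costs)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(costs[j] <= costs[i]) and np.any(costs[j] < costs[i]):
                keep[i] = False
    return keep

print("Pareto-optimal candidates:", np.where(pareto_mask(norm))[0])
```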
- Research Article
- 10.29020/nybg.ejpam.v18i4.7098
- Nov 5, 2025
- European Journal of Pure and Applied Mathematics
- Naser Odat
In engineering, manufacturing, and defense, reliability analysis is essential, yet conventional approaches frequently overlook ambiguous or inaccurate data. To account for constrained uncertainty in the parameters, this paper presents a neutrosophic statistical framework for reliability estimation using the Kumaraswamy distribution. By extending conventional maximum likelihood estimation (MLE) to a neutrosophic MLE, we derive confidence intervals and stress-strength reliability functions under indeterminacy. Simulation findings show that the proposed strategy outperforms classical approaches in terms of robustness, maintaining about 95% coverage even with 20% parameter uncertainty. The neutrosophic intervals provide more relevant uncertainty quantification, adjusting dynamically to sample sizes and parameter constraints. Comparative studies show that while both approaches converge to the true reliability value as the sample size grows, the Fisher Matrix method produces tighter confidence intervals than the Direct Interval approach. By providing engineers and decision-makers with a versatile tool for reliability assessment in real-world settings with ambiguous or inadequate data, our study closes the gap between theoretical rigor and practical application.
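As a rough illustration of the ingredients involved (not the paper's neutrosophic derivation), the sketch below fits the Kumaraswamy distribution by classical MLE at the lower and upper bounds of interval-valued observations and reports an interval estimate of the reliability R(t) = (1 - t^a)^b; the data and indeterminacy width are synthetic.

```python
# Minimal sketch under stated assumptions: classical Kumaraswamy MLE evaluated at
# the endpoints of interval-valued (neutrosophic-style) observations.
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, x):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-x**a))

def fit_kumaraswamy(x):
    res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
    return res.x  # (a_hat, b_hat)

rng = np.random.default_rng(0)
true_a, true_b = 2.0, 3.0
u = rng.uniform(size=200)
x = (1 - (1 - u) ** (1 / true_b)) ** (1 / true_a)     # inverse-CDF sampling
eps = 0.05 * x                                         # made-up indeterminacy half-width
x_lo = np.clip(x - eps, 1e-6, 1 - 1e-6)
x_hi = np.clip(x + eps, 1e-6, 1 - 1e-6)

t = 0.5
R = lambda t, a, b: (1 - t**a) ** b                    # Kumaraswamy reliability
R_bounds = sorted(R(t, *fit_kumaraswamy(xx)) for xx in (x_lo, x_hi))
print("interval estimate of R(0.5):", np.round(R_bounds, 4),
      "true:", round(R(t, true_a, true_b), 4))
```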
- Research Article
- 10.3390/geosciences15110423
- Nov 5, 2025
- Geosciences
- Charalampos Ntigkakis + 2 more
Geological models form the basis for scientific investigations of both the surface and subsurface of urban environments. Urban cover, however, usually prohibits the collection of new subsurface data, so models depend on existing subsurface datasets that are often of poor quality and unevenly distributed in space and time, introducing significant uncertainty. This research proposes a novel method to mitigate the uncertainty caused by clusters of uncertain data points in kriging-based geological modelling. The method estimates orientations from clusters of uncertain data and randomly selects points for geological interpolation. Unlike other approaches, it relies on the spatial distribution of the data, translating geological information from points to geological orientations. This research also compares the proposed approach with locally changing the accuracy of the interpolator through data-informed local smoothing. Using the Ouseburn catchment, Newcastle upon Tyne, UK, as a case study, the results indicate good correlation between both approaches and known conditions, as well as improved performance of the proposed methodology in model validation. The findings highlight a trade-off between model uncertainty and model precision when using highly uncertain datasets. As urban planning, water resources, and energy analyses rely on a robust geological interpretation, the modelling objective ultimately guides the best modelling approach.
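One way to picture the "orientation from a cluster of points" step is a plane fit: the sketch below estimates dip and dip direction from a cluster of noisy 3D contact points via PCA (the smallest-variance direction is the plane normal). It is illustrative, not the authors' workflow, and the cluster is synthetic.

```python
# Minimal sketch: geological orientation (dip, dip direction) from a cluster of
# uncertain 3D points by least-squares plane fitting.
import numpy as np

def plane_orientation(points):
    centred = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    n = vt[-1]                          # smallest-variance direction = plane normal
    if n[2] < 0:                        # make the normal point upwards
        n = -n
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0   # azimuth of steepest descent
    return dip, dip_dir

rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(30, 2))
z = 50.0 - 0.2 * xy[:, 0] + 0.1 * xy[:, 1] + rng.normal(0, 1.5, size=30)  # noisy surface
cluster = np.column_stack([xy, z])

print("dip %.1f deg, dip direction %.1f deg" % plane_orientation(cluster))
```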
- Research Article
- 10.1038/s41598-025-20809-w
- Oct 22, 2025
- Scientific reports
- Chaonan Zhang + 1 more
The selection of a team in team-based sports and professional settings is a crucial decision that requires an optimal choice among multiple criteria, and existing solutions do not simplify the problem of selecting the best team from different alternatives. In the modern age, multi-attribute decision-making (MADM) is a well-known approach for assessing such decision problems. The main objective of this article is to introduce a new aggregation operator (AO), the rough Pythagorean fuzzy Dombi weighted averaging (RPyFDWA) operator, to use the analytic hierarchy process (AHP) to measure the weights of the alternatives, and to investigate the Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) for ranking alternatives by imposing lower and upper approximations under the rough Pythagorean fuzzy set (RPyFS) framework. The RPyFS framework has a superior structure for handling uncertain data compared to existing frameworks. We also establish significant mathematical characteristics of the operator, including idempotency, monotonicity, and boundedness, to determine its flexibility. A case study based on basketball team selection is provided, applying the proposed theory to a practical example of assessing team selection. The proposed AOs are used to select the best team from a list of four teams based on attributes such as physical fitness, performance criteria, experience, age, and injury prevention, and the best team among all considered teams is identified using the proposed RPyFDWA operator and the PROMETHEE method. A comparison with other current approaches and established AOs is provided to analyze the authenticity and validity of the proposed approach, together with a sensitivity analysis observing how changes in the input variables affect the dependability of the model and its decisions. A solid conclusion is provided at the end.
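For orientation, the sketch below shows the crisp PROMETHEE II ranking backbone (usual preference function, net outranking flows); the paper applies this on rough Pythagorean fuzzy evaluations, which are not reproduced here, and the team scores and weights are hypothetical.

```python
# Minimal sketch: crisp PROMETHEE II net-flow ranking for four hypothetical teams.
import numpy as np

def promethee_ii(matrix, weights):
    """matrix: alternatives x (benefit) criteria; returns net outranking flows."""
    n = matrix.shape[0]
    pi = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a != b:
                pref = (matrix[a] > matrix[b]).astype(float)   # usual criterion
                pi[a, b] = np.dot(weights, pref)
    phi_plus = pi.sum(axis=1) / (n - 1)       # leaving flow
    phi_minus = pi.sum(axis=0) / (n - 1)      # entering flow
    return phi_plus - phi_minus               # net flow: higher is better

# Hypothetical scores: fitness, performance, experience, age suitability, injury prevention.
teams = np.array([[8, 7, 6, 9, 7],
                  [7, 9, 8, 6, 8],
                  [9, 6, 7, 7, 6],
                  [6, 8, 9, 8, 9]], dtype=float)
weights = np.array([0.25, 0.30, 0.20, 0.10, 0.15])
net_flows = promethee_ii(teams, weights)
print("ranking (best first):", np.argsort(-net_flows), "net flows:", np.round(net_flows, 3))
```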
- Research Article
- 10.1088/1748-9326/ae0fae
- Oct 14, 2025
- Environmental Research Letters
- Wendi Shao + 6 more
Gridded population and flood hazard data are crucial for flood exposure assessments. However, current assessments incorporate uncertainties related to data selection, and the mechanisms through which subjective data selection propagates uncertainty into exposure models remain poorly understood. To address this gap, this study conducted a comparative assessment of flood exposure in China using five population datasets and five flood hazard datasets. Furthermore, it explored the absolute and relative impacts of data uncertainties on 100-year return period flood exposure and discussed the underlying causes. The results exhibit substantial variations in flood exposure when different data combinations are employed: the exposure estimates differ by as much as 333 million individuals, with the highest estimate being 2.82 times the lowest. Overall, the exposure variation stemmed primarily from differences in flood hazards rather than population patterns, but their relative importance differed spatially depending on slope, altitude, and artificial surface coverage. Despite the differences, all 25 data combinations revealed a disproportionately larger share of the population in floodplains, 2.28–3.49 times the share occupied by floodplains. These findings are significant for understanding the uncertainties of flood exposure and can shed light on informed policies for risk management.
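The core calculation is a simple overlay repeated across dataset combinations; the sketch below uses synthetic stand-ins for the five population and five hazard products to show how input choice alone spreads the exposure estimate.

```python
# Minimal sketch (synthetic arrays, not the study's datasets): population exposure
# for every combination of gridded population and flood-extent layers.
import itertools
import numpy as np

rng = np.random.default_rng(2)
shape = (100, 100)
pop_layers = {f"pop{i}": rng.gamma(2.0, 50.0, size=shape) for i in range(5)}
flood_layers = {f"haz{j}": rng.random(shape) < rng.uniform(0.05, 0.20)
                for j in range(5)}

exposure = {
    (p, h): float(pop[mask].sum())
    for (p, pop), (h, mask) in itertools.product(pop_layers.items(),
                                                 flood_layers.items())
}
values = np.array(list(exposure.values()))
print(f"{len(exposure)} combinations, min {values.min():.0f}, "
      f"max {values.max():.0f}, max/min {values.max() / values.min():.2f}")
```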
- Research Article
- 10.1002/cjce.70122
- Oct 7, 2025
- The Canadian Journal of Chemical Engineering
- Jiandong Yang + 3 more
Industrial equipment measurement data are often subject to errors and uncertainties due to factors such as environmental conditions and equipment aging, posing significant risks to operational safety. To mitigate these issues, we propose a novel process monitoring method based on a multi‐feature space constrained stacked autoencoder (MFSCSAE), designed to reduce the impact of uncertainties. In real‐world industrial processes, uncertain data typically fluctuate within an interval centred around the true value. The MFSCSAE model incorporates multiple feature space constraints, using the upper and lower bounds of this interval as inputs, with the true measurement data serving as the reconstruction target. A new loss function is derived by combining the deviation between the model's output and the true target with the deviation between the features of the hidden layers. The model is trained on normal operational data, and control limits are determined using support vector data description (SVDD). These control limits are then used to assess whether the industrial process is functioning within acceptable bounds. The proposed method is applied to both the Tennessee‐Eastman (TE) process and a real industrial fluid catalytic cracking (FCC) process, demonstrating the effectiveness of the MFSCSAE model in monitoring uncertain processes.
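A simplified sketch of the basic idea of interval inputs with a crisp reconstruction target follows; it omits the paper's multi-feature-space hidden-layer constraints and SVDD control limits, and all dimensions and data are assumed for illustration.

```python
# Minimal sketch: an autoencoder fed the lower/upper bounds of interval-valued
# measurements and trained to reconstruct the true (nominal) values.
import torch
import torch.nn as nn

class IntervalAE(nn.Module):
    def __init__(self, n_vars, hidden=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(2 * n_vars, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, n_vars)

    def forward(self, x_lo, x_hi):
        z = self.encoder(torch.cat([x_lo, x_hi], dim=1))
        return self.decoder(z)

torch.manual_seed(0)
n_vars, n_samples = 10, 256
x_true = torch.randn(n_samples, n_vars)
half_width = 0.1 * torch.rand(n_samples, n_vars)    # made-up uncertainty band
x_lo, x_hi = x_true - half_width, x_true + half_width

model = IntervalAE(n_vars)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x_lo, x_hi), x_true)        # reconstruct the true values
    loss.backward()
    opt.step()
print("final reconstruction loss:", round(loss.item(), 4))
```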
- Research Article
- 10.1063/5.0298978
- Oct 1, 2025
- AIP Advances
- Jian Hua + 4 more
This paper proposes a groundwater level prediction method that integrates time-interval awareness with event-driven modeling, aiming to enhance model performance for non-stationary hydrological processes with abrupt changes. By incorporating event features into the attention mechanism, the framework effectively captures local mutations in groundwater level sequences, while probabilistic forecasting strengthens robustness against uncertain data. Experimental evaluations on eight monitoring wells from the California Department of Water Resources demonstrate that the proposed approach consistently outperforms multiple baseline models under diverse testing scenarios. Specifically, the method achieves an average reduction of 12.4% in MAE and 10.7% in RMSE, while the R² metric exceeds 0.92. Even under conditions of high missing rates or perturbed timestamps, the model maintains stable predictive performance. These results confirm that the proposed framework delivers higher accuracy and stronger robustness in groundwater level forecasting under complex conditions, providing effective support for groundwater resource management and early warning applications.
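For readers unfamiliar with how probabilistic forecasts are scored, the sketch below evaluates the generic pinball (quantile) loss on a few hypothetical groundwater-level quantile forecasts; it is not taken from the paper's network or evaluation code.

```python
# Minimal sketch: pinball (quantile) loss for probabilistic forecasts.
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Average quantile loss for quantile level q in (0, 1)."""
    diff = y_true - y_pred
    return float(np.mean(np.maximum(q * diff, (q - 1) * diff)))

y_true = np.array([12.3, 12.1, 11.8, 11.5, 11.9])        # observed levels (m), made up
forecasts = {                                            # made-up quantile forecasts
    0.1: np.array([11.9, 11.7, 11.4, 11.1, 11.5]),
    0.5: np.array([12.2, 12.0, 11.9, 11.6, 11.8]),
    0.9: np.array([12.6, 12.5, 12.3, 12.0, 12.3]),
}
for q, y_pred in forecasts.items():
    print(f"q={q}: pinball loss = {pinball_loss(y_true, y_pred, q):.3f}")
```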
- Research Article
- 10.1016/j.cie.2025.111335
- Oct 1, 2025
- Computers & Industrial Engineering
- Zezhou Zou + 1 more
Efficiency decomposition for the relational two-stage data envelopment analysis approach with uncertain data
- Research Article
- 10.1016/j.jlp.2025.105666
- Oct 1, 2025
- Journal of Loss Prevention in the Process Industries
- U Bhardwaj + 2 more
An advanced methodology for probabilistic risk assessment under limited and uncertain data: Application to offshore accidents
- Research Article
- 10.1038/s41598-025-03572-w
- Oct 1, 2025
- Scientific Reports
- Walid Emam + 5 more
Speech matching and sports training feature recognition have become increasingly significant in artificial intelligence and sports science, necessitating robust decision-making frameworks to address the inherent uncertainty, hesitation, and cyclic behaviors in these domains. Current approaches to multi-criteria decision-making (MCDM) often fail to address uncertainties and interactions adequately. To overcome these limitations, this paper incorporates complex picture fuzzy information measures (CPF-IM) to boost the accuracy of TOPSIS-based decision-making. In particular, novel similarity measures (SMs) and distance measures (DMs) are developed that cover the real and imaginary components assigned to the membership degree (MD), abstinence degree (AD), and non-membership degree (NMD) within a complex picture fuzzy set (CPFS). The evaluation employs a real-world scenario in which four domain experts rate five speaker profiles under ten relevant criteria. The results indicate that the proposed model achieves consistent alternative rankings by detecting the interdependent relationships between acoustic and biomechanical parameters. The proposed CPF-TOPSIS approach surpasses other techniques in terms of accuracy and reliability, as evidenced by comparative studies. The research establishes a new decision framework for speech and sports sciences that enhances expert assessment decisions by accurately handling uncertain data, cyclical patterns, and evaluation hesitations.
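The classical crisp TOPSIS backbone that CPF-TOPSIS extends is sketched below; the complex picture fuzzy measures themselves are not reproduced, and the speaker-profile ratings and weights are hypothetical.

```python
# Minimal sketch: crisp TOPSIS ranking by closeness to the ideal solution.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit: True where larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)          # vector normalization
    v = norm * weights                                      # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    return d_minus / (d_plus + d_minus)                     # closeness coefficient

# Hypothetical ratings of five speaker profiles on four of the ten criteria.
scores = np.array([[7, 8, 6, 5],
                   [9, 6, 7, 6],
                   [6, 7, 8, 7],
                   [8, 9, 5, 8],
                   [7, 7, 7, 6]], dtype=float)
weights = np.array([0.3, 0.3, 0.2, 0.2])
closeness = topsis(scores, weights, benefit=np.array([True, True, True, True]))
print("ranking (best first):", np.argsort(-closeness))
```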
- Research Article
- 10.1177/18758967251376723
- Sep 30, 2025
- Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology
- Jingrong Wang + 5 more
Context: In the realm of smart firefighting, research on multi-attribute decision modeling has yet to develop a decision-making model for AI-assisted fire rescue plans that integrates both subjective and objective information. Meanwhile, studies using intuitionistic fuzzy sets provide valuable support for multi-attribute decision-making, but they often focus only on data fuzziness and neglect decision-makers’ preferences for alternative plans and attributes. Objective: The primary objective of this study is to develop a sophisticated multi-attribute decision-making model tailored to urban fire emergency decision-making scenarios that integrates both subjective and objective information. Method: To overcome the limitations of previous research, we develop a multi-attribute decision-making model for firefighting that integrates entropy weights and preference weights with intuitionistic fuzzy sets. Specifically, we first preprocess the decision attributes and experts’ preferences, transforming real-valued attributes into intuitionistic fuzzy numbers and determining the attribute weights based on expert preferences. We then introduce an overall framework for multi-attribute decision-making comprising two models: the Fuzzy Entropy Weighted Multi-Attribute Decision-Making model (FEW-MADM) and the Intuitionistic Fuzzy Sets based Integrated Entropy-Weighted and Preference-Weighted Multi-Attribute Decision-Making model (IFPS-EWMADM), which together balance data-driven insights and human expertise; a sketch of the entropy-weighting step follows this entry. Results: We demonstrate an illustrative example in urban firefighting to evaluate the effectiveness of the proposed models. The results show that the FEW-MADM model is suitable for situations where the plan dataset contains no unexpected values, while the IFPS-EWMADM model is appropriate for scenarios involving complex and uncertain data. In addition, a sensitivity analysis examines the conditions under which the original optimal solution remains effective when key data in the case fluctuate within different ranges. Conclusion: The proposed models calculate optimal solutions for various fire area sizes, which not only aid emergency management personnel in devising effective dispatching plans but also enable the analysis and summarization of historical cases. This dual functionality significantly advances the development of “smart firefighting” and markedly enhances the overall efficiency and effectiveness of fire emergency management.
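The sketch below shows the standard entropy-weight computation that such models combine with expert preference weights; the rescue-plan matrix, preference weights, and the 50/50 blend are all illustrative assumptions, not the paper's settings.

```python
# Minimal sketch: entropy weights from a decision matrix, blended with
# hypothetical expert preference weights.
import numpy as np

def entropy_weights(matrix):
    """matrix: alternatives x attributes, all values positive."""
    p = matrix / matrix.sum(axis=0)
    k = 1.0 / np.log(matrix.shape[0])
    entropy = -k * np.sum(p * np.log(p), axis=0)
    d = 1.0 - entropy                     # degree of diversification
    return d / d.sum()

plans = np.array([[3.0, 120.0, 0.8],      # hypothetical rescue plans:
                  [4.0,  90.0, 0.6],      # [trucks, response time (s), coverage]
                  [2.0, 150.0, 0.9]])
objective_w = entropy_weights(plans)
preference_w = np.array([0.5, 0.3, 0.2])  # example expert preference weights
combined = 0.5 * objective_w + 0.5 * preference_w
print("entropy weights:", np.round(objective_w, 3),
      "combined:", np.round(combined, 3))
```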
- Research Article
- 10.26877/bd1eqb34
- Sep 30, 2025
- AKSIOMA: Jurnal Matematika dan Pendidikan Matematika
- Rifky Aisyatul Faroh + 4 more
The agricultural sector in Lamongan Regency is severely affected by the threat of drought, which can reduce food production and farmers' welfare. This study forecasts the area of agricultural drought in the very severe category in Lamongan Regency using the Fuzzy Time Series-Markov Chain (FTS-MC) method. The data used include the average drought area derived from Landsat 8 imagery for 2020–2024, rainfall data, and historical drought data from the Regional Disaster Management Agency. The FTS-MC method was selected for its strength in handling uncertain data and in modeling drought condition transitions probabilistically. The forecasting results show that the pattern produced by FTS-MC closely follows the actual pattern, with good accuracy indicated by a MAPE of 13.25%. The forecast for January 2025 indicates an increase of 925,421.43 m² in drought area. These findings are expected to serve as a reference for local governments and other stakeholders in planning drought mitigation, irrigation management, and more adaptive, data-driven food security strategies.
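A stripped-down version of the FTS-Markov chain idea is sketched below on a made-up drought-area series: fuzzify the series into intervals, estimate a transition matrix between the fuzzy states, and forecast the next value as the transition-weighted mean of the interval midpoints. The full FTS-MC procedure in the paper includes additional adjustment steps not shown here.

```python
# Minimal sketch: fuzzification into equal-width intervals + Markov transition forecast.
import numpy as np

series = np.array([3.1, 3.4, 2.9, 3.8, 4.2, 4.0, 3.6, 4.5, 4.8, 4.3])  # made-up (km^2)
n_states = 4
edges = np.linspace(series.min(), series.max(), n_states + 1)
mids = (edges[:-1] + edges[1:]) / 2
states = np.clip(np.digitize(series, edges) - 1, 0, n_states - 1)

# Transition counts -> row-normalized transition probabilities.
T = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    T[a, b] += 1
row_sums = T.sum(axis=1, keepdims=True)
P = np.divide(T, row_sums, out=np.zeros_like(T), where=row_sums > 0)

last_state = states[-1]
forecast = P[last_state] @ mids            # expected midpoint of the next fuzzy state
print(f"last value {series[-1]}, forecast for next period {forecast:.2f}")
```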
- Research Article
- 10.1038/s41598-025-13466-6
- Sep 30, 2025
- Scientific reports
- Jiaqi Wang
The rapid development of artificial intelligence (AI) and machine learning (ML) has revolutionized computer technology, enabling it to make intelligent decisions, exhibit adaptive behavior, and foster synergistic human-AI environments. To ensure that human-AI collaboration works effectively, analysis frameworks that can handle uncertain, imprecise, and multi-perspective data are necessary. Most of the fuzzy decision-making algorithms introduced in this study are machine learning-inspired fuzzy algorithms that incorporate human expert opinions through T-spherical fuzzy sets (TSFS) and Aczel-Alsina aggregation operators. The algorithm represents nuanced human opinions in four linguistic grades: positive, abstention, negative, and refusal, and interprets them reliably under uncertain conditions. The framework utilizes fuzzy-logic tools, including tunable operators and defuzzification methods, to process expert data, prioritize AI tools, and facilitate valuable collaboration. The methodology is validated through a case study in which professionals evaluate AI-based systems at various levels, reporting increased trust, explainability, and flexibility in the human-AI collaboration setting.
- Research Article
- 10.52209/1609-1825_2025_3_482
- Sep 29, 2025
- TRUDY UNIVERSITETA
- Gulbakyt Ansabekova + 2 more
The hypothesis of applying fuzzy logic to the protection and automatic control of electric power system objects is considered, with short-circuit types treated as faults on power lines. Applying fuzzy logic to the automation and control of the electric system makes it possible to accurately recognise short-circuit types from uncertain disturbance-parameter data, to disconnect the damaged section before it reaches a critical state, to increase the selectivity of disconnection, and to increase the speed of protection and automatic control devices. Determining or identifying short-circuit types in 110 kV transmission lines with fuzzy logic helps protect the power system from abnormal modes and avoid technical and economic losses.
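To make the idea concrete, the sketch below applies triangular fuzzy memberships to hypothetical per-phase current deviations and evaluates two simple min (AND) rules for fault type; the membership parameters and rules are illustrative assumptions, not the authors' rule base.

```python
# Minimal sketch: graded fuzzy indication of short-circuit type from per-phase
# current deviations.
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical per-unit current deviations on phases A, B, C during a disturbance.
dev = {"A": 3.2, "B": 0.4, "C": 0.3}
high = {ph: trimf(v, 1.0, 4.0, 7.0) for ph, v in dev.items()}   # "current is high"
low = {ph: trimf(v, -1.0, 0.0, 1.0) for ph, v in dev.items()}   # "current is normal"

# Two example rules; a real protection scheme would use a much richer rule base.
single_phase_A = min(high["A"], low["B"], low["C"])
three_phase = min(high["A"], high["B"], high["C"])
print(f"mu(A-to-ground fault) = {single_phase_A:.2f}, "
      f"mu(three-phase fault) = {three_phase:.2f}")
```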
- Research Article
- 10.18038/estubtda.1671595
- Sep 25, 2025
- Eskişehir Technical University Journal of Science and Technology A - Applied Sciences and Engineering
- Aslı Kaya Karakütük
Time series forecasting becomes more critical when nonlinear, non-stationary, and uncertain data are involved. This paper proposes an innovative hybrid model that combines wavelet transforms, high-order fuzzy cognitive maps, and random forest regression for high-accuracy time series prediction, aiming to overcome the limitations of traditional time series analysis methods. In this approach, wavelet coefficients are enhanced by the integration of higher-order fuzzy cognitive maps, which allows efficient modeling of nonlinear relationships through quadratic interactions. Four wavelet transforms (Morlet, Mexican Hat, Haar, Daubechies) are used in the model and systematically compared. The model is trained with random forest regression, and hyperparameter optimization is performed using GridSearchCV. Model performance is evaluated on datasets of atmospheric CO2 concentrations from the Mauna Loa Observatory, El Niño sea surface temperatures, and sunspot activity records. A comprehensive analysis is presented by evaluating model performance with multiple metrics, including symmetric mean absolute percentage error, root mean squared error, mean absolute percentage error, and mean absolute scaled error. The experimental results show that the Mexican Hat wavelet performs best on all datasets, outperforming the other variants with root mean squared error values of 0.2414 on the CO2 data, 0.1621 on El Niño temperature prediction, and 4.3279 on sunspot activity. While continuous wavelets are more successful than discrete wavelets, the hybrid structure significantly improves the model's ability to capture nonlinear relationships. This research not only demonstrates the superiority of wavelet-based hybrid models in time series analysis but also shows their potential for practical application in areas such as climate, meteorology, and space weather studies.
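The general shape of a wavelet + random forest hybrid is sketched below on a toy series: Mexican Hat CWT coefficients of a lagged window feed a Random Forest regressor tuned with GridSearchCV. The feature design is an assumption for illustration, and the paper's high-order fuzzy cognitive map stage is not reproduced.

```python
# Minimal sketch: CWT features ('mexh') + RandomForestRegressor + GridSearchCV.
import numpy as np
import pywt
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(3)
t = np.arange(600)
series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)  # toy series

window, scales = 32, np.arange(1, 6)
X, y = [], []
for i in range(window, len(series)):
    seg = series[i - window:i]
    coefs, _ = pywt.cwt(seg, scales, "mexh")   # shape: (n_scales, window)
    X.append(coefs[:, -1])                     # coefficients at the most recent point
    y.append(series[i])
X, y = np.array(X), np.array(y)

grid = GridSearchCV(RandomForestRegressor(random_state=0),
                    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
                    cv=3)
split = int(0.8 * len(X))
grid.fit(X[:split], y[:split])
print("held-out R^2:", round(grid.score(X[split:], y[split:]), 3))
```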
- Research Article
- 10.1080/23737484.2025.2550445
- Sep 16, 2025
- Communications in Statistics: Case Studies, Data Analysis and Applications
- Tara Mohammadi + 2 more
Research on multi-class SVMs is ongoing, but methods modeled for precise data can be less accurate due to measurement and modeling errors; in such situations, we face uncertain data sets. This paper introduces a multi-class SVM using a regular simplex for stochastic inputs. The presented model is based on a regular simplex support vector machine with probabilistic constraints, investigated for both known and unknown populations. Noise with a known distribution is introduced into the constraints, and the probabilistic constraints are converted to deterministic ones using statistical theory and moment estimation. In the simulation study, multinomial logistic regression is used to fit the relationship between features and labels through the Monte Carlo method. We demonstrate that the proposed model is more efficient than the model relying on accurate data, presenting an improved version of the RSSVM model.
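The sketch below illustrates the general moment-based release of a single chance constraint under a Gaussian assumption, the same kind of conversion applied to the stochastic SVM constraints; the weight vector, moments, and risk level are all assumed values, and the full regular-simplex model is not reproduced.

```python
# Minimal sketch: P(w.x + b >= 1 - xi) >= 1 - eps  becomes
# w.mu + b - z_{1-eps} * sqrt(w' Cov w) >= 1 - xi  when x ~ N(mu, Cov).
import numpy as np
from scipy.stats import norm

w = np.array([0.8, -0.5])
b = 0.2
mu_x = np.array([1.0, 0.3])                 # mean of the uncertain feature vector
cov_x = np.array([[0.04, 0.01],             # its covariance (assumed known)
                  [0.01, 0.09]])
eps = 0.05

margin_mean = w @ mu_x + b
margin_std = np.sqrt(w @ cov_x @ w)
det_margin = margin_mean - norm.ppf(1 - eps) * margin_std
xi_required = max(0.0, 1.0 - det_margin)    # slack needed to satisfy the constraint
print(f"deterministic margin: {det_margin:.3f}, slack needed: {xi_required:.3f}")
```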
- Research Article
- 10.1080/1448837x.2025.2555023
- Sep 15, 2025
- Australian Journal of Electrical and Electronics Engineering
- Qiang Huang + 2 more
The integration of variable renewable energy (RE) into modern energy systems introduces significant variability and uncertainty, posing challenges to effective multi-energy system (MES) operation. This study proposes a novel optimisation framework that integrates grey number theory with a modified whale optimisation algorithm (WOA) to address these challenges. Grey number theory models incomplete and uncertain data, while WOA, a meta-heuristic algorithm, identifies optimal operational strategies. A dual-objective function is formulated to minimise both operational costs and greenhouse gas (GHG) emissions. The framework is validated on an MES case study comprising wind–solar conversion, conventional energy sources, and storage systems. Results demonstrate substantial improvements compared to conventional approaches, including a 22% reduction in operational costs, 19% reduction in GHG emissions, and up to 15% improvement in system reliability. These outcomes highlight the robustness and practicality of integrating grey theory with WOA for managing RE uncertainties. The proposed methodology not only enhances economic and environmental performance but also supports a more sustainable and reliable MES operation.
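As an illustration of the grey-number side only, the sketch below models grey numbers as closed intervals so an uncertain renewable forecast propagates through a simple dispatch-cost calculation as an interval rather than a point; the quantities are hypothetical and the modified WOA is not reproduced here.

```python
# Minimal sketch: grey (interval) arithmetic for an uncertain residual load and fuel cost.
from dataclasses import dataclass

@dataclass
class Grey:
    lo: float
    hi: float
    def __add__(self, other):
        return Grey(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Grey(self.lo - other.hi, self.hi - other.lo)
    def scale(self, k):                     # k >= 0
        return Grey(k * self.lo, k * self.hi)

demand = Grey(95.0, 105.0)                  # MW, hypothetical
wind = Grey(20.0, 35.0)
solar = Grey(10.0, 18.0)
gas_price = 60.0                            # $/MWh, crisp

residual = demand - (wind + solar)          # conventional generation needed
cost = residual.scale(gas_price)
print(f"residual load: [{residual.lo:.1f}, {residual.hi:.1f}] MW, "
      f"fuel cost: [{cost.lo:.0f}, {cost.hi:.0f}] $/h")
```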
- Research Article
- 10.1371/journal.pone.0330729
- Sep 2, 2025
- PLOS One
- Wei Zhuang + 1 more
As a common experimental technique, qPCR (quantitative real-time polymerase chain reaction) is widely used to measure levels of nucleic acids, e.g., microRNAs and messenger RNA. Although qPCR generally provides accurate and complete data, researchers inevitably encounter uncertainly determined qPCR data because of intrinsically low amounts of biological material, and the presence of incomplete or uncertain qPCR data challenges interpretation accuracy. This study presents a web application that integrates two sophisticated statistical methods, a flexible regression approach and a two-group hypothesis testing technique, to enhance the accuracy and robustness of qPCR data analysis with informative but uncertainly determined observations. To demonstrate the versatility and efficacy of our MCTOT (Multi-Functional Cycle-To-Threshold Statistical Analysis Tool) application, this study presents two distinct examples employing two-group hypothesis testing. The first example analyzes pathogens in wastewater, an area of increasing relevance for public health surveillance. The second illustrates an application in liquid biopsy, a rapidly evolving field in disease diagnostics, monitoring, and early treatment. The application's workflow is further exhibited through another liquid biopsy example, in which the flexible regression method is employed to detect the hemolysis effect on a molecular target. These examples demonstrate the tool's capacity not only to identify significant differences between groups but also to quantify effect size, a crucial aspect of biomedical research. The MCTOT web application is a pioneering step toward empowering researchers to harness the full potential of qPCR data, especially when dealing with informative but uncertainly determined observations, and it paves the way for further development of web-based tools that adhere to the refined CTOT (Cycle-To-Threshold) methodology, opening new avenues in qPCR data analysis and interpretation. The application is openly available via Shinyapps.io at https://ctot.shinyapps.io/bioinformatics/.
- Research Article
- 10.1109/tnnls.2025.3563889
- Sep 1, 2025
- IEEE transactions on neural networks and learning systems
- Jie Yang + 6 more
The granular-ball (GB)-based classifier introduced by Xia adaptively creates coarse-grained information granules for its input, enhancing generality and flexibility. Nevertheless, current GB-based classifiers rigidly assign a specific class label to each data instance and lack strategies for handling uncertain instances; forcing certain classifications onto uncertain instances may incur considerable risk. To solve this problem, we construct a robust three-way classifier with shadowed GBs (3WC-SGBs) for uncertain data. First, combined with information entropy, we propose an enhanced GB generation method based on the principle of justifiable granularity. Subsequently, based on minimum uncertainty, a shadowed mapping is used to partition each GB into a core region (COR), an important region (IMP), and an unessential region (UNE). Based on the constructed shadowed GBs, we establish a three-way classifier that categorizes data instances into certain classes and an uncertain case. Finally, extensive comparative experiments are conducted with two three-way classifiers, three state-of-the-art GB-based classifiers, and three classical machine learning classifiers on 12 public benchmark datasets. The results show that our model is robust in managing uncertain data and effectively mitigates classification risks, and it outperforms the other comparison methods in both effectiveness and efficiency in almost all cases.
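The deferral idea behind three-way classification can be shown with thresholds alone, as in the sketch below; the shadowed granular-ball construction itself is not reproduced, and the probabilities and thresholds are hypothetical.

```python
# Minimal sketch: three-way decision that defers instances whose class probability
# falls between the acceptance and rejection thresholds instead of forcing a label.
import numpy as np

def three_way_decide(p_positive, alpha=0.75, beta=0.35):
    """Return 'accept', 'reject', or 'defer' for each probability."""
    decisions = np.full(p_positive.shape, "defer", dtype=object)
    decisions[p_positive >= alpha] = "accept"
    decisions[p_positive <= beta] = "reject"
    return decisions

probs = np.array([0.92, 0.55, 0.18, 0.71, 0.40])   # hypothetical classifier outputs
print(dict(zip(np.round(probs, 2), three_way_decide(probs))))
```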
- Research Article
- 10.1016/j.neucom.2025.130669
- Sep 1, 2025
- Neurocomputing
- Ruihua Zhang + 4 more
Damped sliding based mining of average utility-driven sequential patterns over uncertain data streams