From mathematical to AI-based methods: A review of marine PNT data fusion and uncertainty handling


Similar Papers
  • Research Article
  • Cited by 79
  • 10.1007/s10479-021-04006-2
Handling of uncertainty in medical data using machine learning and probability theory techniques: a review of 30 years (1991–2020)
  • Mar 21, 2021
  • Annals of Operations Research
  • Roohallah Alizadehsani + 14 more

Understanding data and reaching accurate conclusions are of paramount importance in the present era of big data. Machine learning and probability theory methods have been widely used for this purpose in various fields. One critically important yet less explored aspect is capturing and analyzing uncertainty in the data and the model: proper quantification of uncertainty provides valuable information for reaching an accurate diagnosis. This paper reviews studies conducted over the last 30 years (1991–2020) on handling uncertainty in medical data using probability theory and machine learning techniques. Medical data are especially prone to uncertainty because of noise, so clean, noise-free data are essential for accurate diagnosis, and the sources of noise must be identified to address the issue. Diagnoses and treatment plans are prescribed on the basis of the medical data available to the physician; uncertainty in healthcare is therefore growing, while knowledge of how to address it remains limited. Our findings indicate several open challenges in handling uncertainty in raw medical data and in new models. This work summarizes the various methods employed to overcome this problem; in recent years, novel deep learning techniques have also been proposed to deal with such uncertainties and improve decision-making performance.

  • Research Article
  • Cited by 19
  • 10.1016/j.pmcj.2016.09.004
Uncertainty handling in rule-based mobile context-aware systems
  • Oct 5, 2016
  • Pervasive and Mobile Computing
  • Szymon Bobek + 1 more


  • Research Article
  • Cited by 44
  • 10.1029/94jb00803
On the handling of uncertainties in estimating the hazard of rupture on a fault segment
  • Jul 10, 1994
  • Journal of Geophysical Research: Solid Earth
  • D A Rhoades + 2 more

Uncertainties in data and parameter values have often been ignored in hazard estimates based on historic and prehistoric records of rupture on fault segments. A mixture of distributions approach is appropriate to handle uncertainties in parameters of recurrence time distributions estimated from the geological and historical earthquake record of a fault segment, and a mixture of hazards approach is appropriate for data uncertainties and for uncertainties in parameters estimated from a set of similar faults. The former approach admits updating of the distributions for uncertainty as time passes. The aim is to present the hazard as a single value which takes account of both data and parameter uncertainties, conditional only on modeling assumptions. The proposed methods are described in detail for the exponential and lognormal recurrence time models for fault‐rupturing earthquakes and applied, by way of illustration, to selected fault segments, namely, the Mojave segment of the San Andreas fault, California, and the Wellington‐Hutt Valley segment of the Wellington fault, New Zealand.
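The mixture-of-hazards idea described above can be sketched in a few lines: compute the conditional probability of rupture in the next interval for each candidate parameter set of a lognormal recurrence-time model, then average those hazards under the parameter weights. This is an illustrative sketch with made-up parameter values, not code or data from the paper:

```python
import math

def lognormal_cdf(t, mu, sigma):
    """CDF of a lognormal recurrence-time distribution."""
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_hazard(t_elapsed, dt, mu, sigma):
    """P(rupture in (t, t+dt] | no rupture by t) for one parameter set."""
    F_t = lognormal_cdf(t_elapsed, mu, sigma)
    F_next = lognormal_cdf(t_elapsed + dt, mu, sigma)
    return (F_next - F_t) / (1.0 - F_t)

def mixture_hazard(t_elapsed, dt, params):
    """Mixture of hazards: weighted average over (weight, mu, sigma) draws."""
    return sum(w * conditional_hazard(t_elapsed, dt, mu, s) for w, mu, s in params)

# Three equally weighted parameter sets expressing parameter uncertainty
# (hypothetical median recurrence times of 150, 170, and 200 years)
params = [(1/3, math.log(150), 0.3),
          (1/3, math.log(170), 0.4),
          (1/3, math.log(200), 0.5)]
h = mixture_hazard(120.0, 30.0, params)  # hazard over the next 30 years
```

The single fused value `h` is what the paper advocates reporting: a hazard conditional only on modeling assumptions, with parameter uncertainty folded in rather than ignored.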

  • Research Article
  • Cited by 40
  • 10.1007/s10198-010-0236-4
Practical issues in handling data input and uncertainty in a budget impact analysis
  • Apr 3, 2010
  • The European Journal of Health Economics
  • M J C Nuijten + 2 more

The objective of this paper was to address in more detail the importance of dealing systematically and comprehensively with uncertainty in a budget impact analysis (BIA). The handling of uncertainty in health economics was used as a point of reference for addressing uncertainty in a BIA. This overview shows that standard methods of sensitivity analysis, which are used for the standard data set in a health economic model (clinical probabilities, treatment patterns, resource utilisation and prices/tariffs), cannot always be used for the input data of the BIA model beyond the health economic data set, for various reasons. Whereas in a health economic model only limited data may come from a Delphi panel, a BIA model often relies on a majority of data taken from a Delphi panel. In addition, the data set in a BIA model also includes forecasts (e.g. annual growth, uptake curves, substitution effects, changes in prescription restrictions and guidelines, future distribution of the available treatment modalities, off-label use). As a consequence, the use of standard sensitivity analyses for the BIA data set might be limited because appropriate distributions are lacking, data sources are limited, or forecasting is required. Therefore, scenario analyses might be more appropriate for capturing the uncertainty in the BIA data set within the overall BIA model.

  • Research Article
  • 10.3138/m322-4g67-668p-2155
A Surface Approach to the Handling of Uncertainties in An Integrated Spatial Database Environment
  • Apr 1, 1996
  • Cartographica: The International Journal for Geographic Information and Geovisualization
  • Jingxiong Zhang

Motivated by the capabilities and flexibilities of surface data models and methods to capture the spatial variabilities central to many geographical phenomena, this chapter explores the advantages of surface-based methods for error handling in spatial databases. At a general level, methods used in the past for representing and handling errors are described, along with their associated problems. Surface-based methods are then advocated as offering a unified strategy that can be used to represent different kinds of errors, as well as to indicate their spatial variability. This is followed by a description of fuzzy surfaces that can be used to represent uncertainties in categorical data. Empirical studies were undertaken in the context of local Edinburgh suburban land-cover mapping, using aerial photographs and remotely sensed images. The results are encouraging, and suggest further exploration and use of the surface-based methods.

  • Research Article
  • 10.1002/smr.2742
Prioritization of Software Bugs Using Entropy‐Based Measures
  • Nov 26, 2024
  • Journal of Software: Evolution and Process
  • Madhu Kumari + 2 more

Open-source software evolves through the active participation of its users. Users typically request bug fixes, new features, and feature enhancements, so software repositories are growing at an enormous rate. These distinct user requests add uncertainty and irregularity to the reported bug data, and inappropriate handling of that uncertainty and irregularity drastically degrades the performance of machine learning algorithms. Researchers have applied machine learning techniques to assign bug priority without considering the uncertainty and irregularity in reported bug data. To capture them, this study combines a summary entropy-based measure with severity and summary weight to predict the priority of bugs in open-source projects. Classifiers are built using these measures with different machine learning techniques, namely k-nearest neighbor (KNN), naïve Bayes (NB), J48, random forest (RF), condensed nearest neighbor (CNN), multinomial logistic regression (MLR), decision tree (DT), deep learning (DL), and neural network (NNet). The research systematically analyzes the summary entropy-based classifiers from three aspects: the machine learning technique used, the estimation of various performance measures (Accuracy, Precision, Recall, and F-measure), and comparison with existing models. The experimental analysis is carried out on three open-source projects, namely Eclipse, Mozilla, and OpenOffice. Out of 145 cases (29 products × 5 priority levels), the J48, RF, DT, CNN, NNet, DL, MLR, and KNN techniques give the maximum F-measure in 46, 35, 28, 11, 15, 4, 3, and 1 cases, respectively. The results show that the proposed summary entropy-based approach outperforms the approach without entropy and improves Accuracy and F-measure compared with existing approaches. It can be concluded that classifiers built with the summary entropy measure significantly improve the performance of machine learning algorithms through appropriate handling of uncertainty and irregularity; moreover, the proposed summary entropy-based classifiers outperform the existing models in the literature for predicting bug priority.
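The core feature behind such an approach is the Shannon entropy of a bug report's term distribution, optionally scaled by a severity weight. The paper's exact formulation is not given here, so the following is a minimal illustrative sketch with a hypothetical `priority_feature` combination:

```python
import math
from collections import Counter

def summary_entropy(summary):
    """Shannon entropy (bits) of the term distribution in a bug summary."""
    words = summary.lower().split()
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def priority_feature(summary, severity_weight):
    """Hypothetical combined feature: entropy scaled by a severity weight."""
    return severity_weight * summary_entropy(summary)

# Four distinct terms are maximally irregular: entropy = log2(4) = 2.0 bits
h = summary_entropy("crash on save dialog")
```

A feature like this would then be fed, alongside severity and summary weight, into any of the listed classifiers (KNN, J48, RF, and so on) for priority prediction.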

  • Research Article
  • Cited by 14
  • 10.1016/j.scitotenv.2022.154992
DynSus: Dynamic sustainability assessment in groundwater remediation practice
  • Apr 2, 2022
  • Science of The Total Environment
  • Mehran Naseri-Rad + 5 more

Decision-making processes for the clean-up of contaminated sites are often highly complex and inherently uncertain. Outcomes depend not only on hydrological and biogeochemical site variability, but also on the associated health, environmental, economic, and social impacts of taking, or not taking, action. These variabilities suggest that a dynamic framework is required for promoting sustainable remediation. To this end, the decision support system DynSus is presented here, integrating a predeveloped contaminant fate and transport model with a sustainability assessment tool. Implemented within a system dynamics framework, the new tool uses model simulations to provide remediation scenario analysis and handling of uncertainty in various data. DynSus was applied to a site in south Sweden contaminated with pentachlorophenol (PCP). Simulation scenarios were developed to enable comparison between alternative remediation strategies and combinations of these. Such comparisons are provided for selected sustainability indicators and for remediation performance (in terms of concentration at the recipient), leading to identification of the most critical variables for ensuring that sustainable solutions are chosen. Simulation results indicated that although passive practices, e.g., monitored natural attenuation, were more sustainable at first (5–7 years after beginning remediation measures), they failed to compete with more active practices, e.g., bioremediation, over the entire life cycle of the project (from the beginning of remedial action to achieving the target concentration at the recipient). In addition, statistical tools (clustering and genetic algorithms) were used to further assess the available hydrogeochemical data. Taken together, the results reaffirmed the suitability of the simple analytical framework implemented in the contaminant transport model. DynSus outcomes could therefore enable site managers to evaluate different scenarios more quickly and effectively for life cycle sustainability in such a complex and multidimensional problem.

  • Research Article
  • Cited by 17
  • 10.1007/s10669-013-9442-9
Uncertainty modelling in multi-criteria analysis of water safety measures
  • May 7, 2013
  • Environment Systems and Decisions
  • Andreas Lindhe + 4 more


  • Conference Article
  • Cited by 6
  • 10.1109/icmlc.2004.1382363
Uncertainty in spatial data mining
  • Aug 26, 2004
  • Bin-Bin He + 2 more

Spatial data mining refers to extracting the hidden, implicit, valid, novel and interesting spatial or non-spatial patterns or rules from large, incomplete, noisy, fuzzy, random, and practical spatial databases. An important but still underdeveloped issue is to reveal and handle the uncertainties in spatial data mining. In this work, the uncertainty of spatial data is first briefly analyzed, including the types and origins of uncertainty and their models of measurement and propagation. Then, some uncertainty factors in the processing of spatial data are discussed and some uncertainty handling methods are adopted, including maximum variance data discretization and the fuzzy belief function. Finally, the processing of spatial data is regarded as a complex system, a linear serial processing system in the sense of engineering control systems, and an uncertainty propagation model for spatial data, a fuzzy logic uncertainty propagation model with a credibility factor, is developed. Several key open problems in uncertainty handling and propagation for spatial data are also put forward.

  • Research Article
  • Cited by 15
  • 10.1016/j.asoc.2023.110142
Ship weather routing featuring w-MOEA/D and uncertainty handling
  • May 1, 2023
  • Applied Soft Computing
  • Rafal Szlapczynski + 2 more


  • Research Article
  • 10.36962/piretc29082023-43
DATA FUSION OF ULTRASONIC SENSORS FOR DISTANCE MEASUREMENT
  • Nov 6, 2023
  • PIRETC-Proceeding of The International Research Education & Training Centre
  • Mahabbat Khudaverdieva Mahabbat Khudaverdieva

Distance measurement plays a fundamental role in contemporary technology, exerting a significant influence on applications across sectors ranging from autonomous navigation and industrial automation to robotics. Ultrasonic sensors, valued for their simplicity, cost-efficiency, and adaptability, have become essential instruments for gauging distance. Nevertheless, individual ultrasonic sensors are not free of constraints, including inaccuracies stemming from environmental conditions, sensor noise, and measurement errors. This article delves into data fusion, a practice that combines data from multiple ultrasonic sensors, supported by algorithms from artificial intelligence (AI) and statistics. The fusion of ultrasonic sensor data goes beyond mere aggregation; it harnesses AI to refine data integration, which not only enhances measurement precision but also equips systems to function dependably in dynamic and demanding environments. The article explores the importance of ultrasonic sensors, the need for data fusion, procedures for data preprocessing, the extraction of relevant features, the application of fusion algorithms, and the handling of uncertainties. It examines the use of ultrasonic sensor data fusion in domains such as autonomous vehicles, robotics, industrial automation, healthcare, and environmental monitoring, and it confronts issues associated with data quality, environmental variation, privacy, and ethics. The article concludes with promising directions for the field, including advanced algorithms, seamless integration of different sensor types, real-time data processing, and a strong emphasis on safety-critical applications. Data fusion from ultrasonic sensors is an evolving and transformative technology with profound consequences for many sectors; its responsible advancement and deployment are of paramount importance to realizing its full potential while upholding safety, ethics, and reliability in a continuously evolving technological landscape.

Keywords: Complex measurement, Multi-sensor distance measurement, Ultrasonic sensors, Data fusion, Measurement accuracy.
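The simplest statistical form of the multi-sensor fusion described above is inverse-variance weighting: each sensor's reading is weighted by the reciprocal of its noise variance, so noisier sensors contribute less and the fused estimate has lower variance than any single reading. This is a generic textbook sketch, not the article's specific algorithm, and the variance values are hypothetical:

```python
def fuse_distances(measurements):
    """Inverse-variance weighted fusion of (distance, variance) readings.

    Returns the fused distance estimate and its (reduced) variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * d for w, (d, _) in zip(weights, measurements)) / total
    fused_var = 1.0 / total
    return fused, fused_var

# Three ultrasonic sensors measuring the same target (metres, metres^2)
readings = [(2.02, 0.01), (1.98, 0.04), (2.10, 0.09)]
d, v = fuse_distances(readings)
```

The fused variance is always smaller than the best individual sensor's variance, which is the quantitative payoff of fusing redundant sensors rather than trusting any one of them.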

  • Research Article
  • Cited by 5
  • 10.1016/j.isatra.2020.12.021
Uncertain texture features fusion based method for performance condition evaluation of complex electromechanical systems
  • Dec 13, 2020
  • ISA Transactions
  • Rongxi Wang + 4 more


  • Book Chapter
  • Cited by 11
  • 10.1016/b978-0-444-53632-7.01102-3
6.02 - Learning Rule-Based Models - The Rough Set Approach
  • Jan 1, 2014
  • Comprehensive Biomedical Physics
  • J Komorowski


  • Research Article
  • Cited by 1
  • 10.1016/j.scitotenv.2025.180503
Advancements in soil moisture estimation through integration of remote sensing and artificial intelligence techniques.
  • Oct 1, 2025
  • The Science of the total environment
  • Harani P + 4 more


  • Conference Article
  • Cited by 11
  • 10.1145/3362789.3362833
Qualifying and Quantifying Uncertainty in Digital Humanities
  • Oct 16, 2019
  • Patricia Martin-Rodilla + 2 more

Paper presented at the 7th International Conference on Technological Ecosystems for Enhancing Multiculturality, held in León, Spain, 16–18 October 2019.
