Abstract

Purpose
Historical data often consist of overlapping reports, and these reports may contain inconsistent values that cause a query to return incorrect results. Moreover, the users who issue the query may remain unaware of this inconsistency even after a data-cleaning process (e.g. schema matching or data screening). Inconsistency can arise in different types of data, such as temporal or spatial data. This paper therefore introduces an information fusion method that detects data inconsistency in the early stages of data fusion.

Design/methodology/approach
The paper introduces an information fusion method for multi-robot operations in which fusion is performed continuously. When multiple robots explore an environment, their logs provide additional information about the number and coordinates of targets or victims. The proposed method builds an underdetermined linear system from the overlapping spatial reports and estimates the case values, solving the system with the least squares method. Together, these two steps detect conflicts between reports and estimate the interval values at specific times or locations.

Findings
The proposed method was tested on inconsistency detection and target projection for spatial fusion in sensor networks, using sensor data from simulations in which robots perform search tasks. The system can be extended to data warehouses with heterogeneous data sources to achieve completeness, robustness and conciseness.

Originality/value
Little research has addressed linear systems for information fusion in mobile-robot tasks. The proposed method reduces the time and comparison cost of data fusion and lowers the probability of errors caused by incorrect results.
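The core idea of building a linear system from overlapping reports and checking it for conflicts can be sketched with a small least squares example. This is an illustrative reconstruction, not the paper's implementation: the interval layout, report values and residual tolerance below are assumptions chosen for demonstration.

```python
import numpy as np

def fuse_reports(A, b, tol=1e-8):
    """Fuse overlapping interval reports via least squares.

    Each row of A marks which time bins a report covers; b holds the
    reported totals. Returns the minimum-norm estimate of the per-bin
    values and a flag indicating whether the reports are consistent
    (i.e. the residual of the least squares fit is near zero).
    """
    x, *_ = np.linalg.lstsq(A, b, rcond=None)  # minimum-norm solution
    residual = np.linalg.norm(b - A @ x)       # zero iff reports agree
    return x, residual <= tol

# Three consistent overlapping reports over four time bins
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)
b = np.array([5.0, 7.0, 5.0])
x, ok = fuse_reports(A, b)        # ok is True: no conflict detected

# A fourth report covers the same bins as the first but disagrees
A2 = np.vstack([A, [1.0, 1.0, 0.0, 0.0]])
b2 = np.append(b, 6.0)            # conflicts with the earlier total of 5
x2, ok2 = fuse_reports(A2, b2)    # ok2 is False: inconsistency detected
```

Because the system is underdetermined, a consistent set of reports always admits an exact solution (zero residual); a nonzero residual can only come from conflicting reports, which is what makes the residual check usable as an early inconsistency detector.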

