Abstract

The application of multisource information fusion in real-world scenarios is an emerging practice because it effectively uses consistent and complementary data to optimize decision-making. Dempster-Shafer (D-S) evidence theory is prevalent because it competently handles uncertainty by assigning basic probability assignments (BPAs) to multielement subsets. However, counterintuitive results may be obtained when the evidence is highly conflicting. To overcome this flaw, this paper defines a new divergence measure to quantify the differences between BPAs; we name this new metric the belief Rényi divergence. The belief Rényi divergence takes the number of possible hypotheses into consideration, which makes it a more rational and effective difference measure in the realm of evidence theory. Additionally, some important properties of the belief Rényi divergence are extensively explored and proven, including its connections to the Kullback-Leibler divergence, the Hellinger distance, and the χ² divergence. Moreover, a novel multisource information fusion method is devised based on the proposed belief Rényi divergence and belief entropy. The proposed belief Rényi divergence efficiently models the differences between pieces of evidence, and the belief entropy is used to calculate the information volume of each piece of evidence. Thus, the proposed method can sufficiently exploit both the relationships among the evidence and the information volume of the evidence itself. Two case studies are presented to verify the effectiveness and practicality of the proposed method, and an experiment on Iris dataset classification evaluates its performance. In addition, an EEG data analysis application demonstrates that the proposed method can be effectively used in real-world settings.
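To make the two ingredients of the fusion method concrete, the sketch below illustrates them on toy BPAs. The paper's exact belief Rényi divergence formula is not reproduced in this abstract, so the divergence shown is the classical Rényi divergence applied to BPAs sharing the same focal elements, and the belief entropy follows Deng's definition; both are assumptions for illustration, not the authors' definitions.

```python
import math

def belief_entropy(bpa):
    """Deng's belief entropy: -sum_A m(A) * log2(m(A) / (2^|A| - 1)).

    The (2^|A| - 1) term accounts for the number of nonempty subsets of a
    focal element A, so mass on larger (more ambiguous) sets carries more
    information volume.
    """
    total = 0.0
    for focal, mass in bpa.items():
        if mass > 0:
            total -= mass * math.log2(mass / (2 ** len(focal) - 1))
    return total

def renyi_divergence(p, q, alpha=0.5):
    """Classical Rényi divergence of order alpha (alpha != 1) between two
    BPAs defined over the same focal elements -- a stand-in for the paper's
    belief Rényi divergence, which additionally weights by subset size."""
    s = sum(p[a] ** alpha * q[a] ** (1 - alpha) for a in p if p[a] > 0)
    return math.log2(s) / (alpha - 1)

# Two BPAs over the frame {a, b}; frozenset keys model multielement subsets.
m1 = {frozenset('a'): 0.6, frozenset('b'): 0.1, frozenset('ab'): 0.3}
m2 = {frozenset('a'): 0.2, frozenset('b'): 0.5, frozenset('ab'): 0.3}

print(belief_entropy(m1))        # information volume of m1
print(renyi_divergence(m1, m2))  # positive: m1 and m2 disagree
print(renyi_divergence(m1, m1))  # 0.0: identical evidence
```

In a divergence-based fusion scheme of this kind, pairwise divergences give each piece of evidence a credibility weight (low average divergence means high credibility), the entropy measures its information volume, and the weighted evidence is then combined with Dempster's rule.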
