Abstract

We propose five different ways of integrating the Dempster-Shafer theory of evidence with rank nearest neighbor classification rules, with a view to exploiting the benefits of both. These algorithms have been tested on both real and synthetic data sets and compared with the k-nearest neighbor rule (k-NN), the m-multivariate rank nearest neighbor rule (m-MRNN), and the k-nearest neighbor Dempster-Shafer theory rule (k-NNDST), which also combines Dempster-Shafer theory with the k-NN rule. When different features have widely different variances, distance-based classifiers such as k-NN and k-NNDST may not perform well, whereas the proposed algorithms are expected to perform better in that case. Our simulation results confirm this. Moreover, the proposed algorithms show significant improvement over the m-MRNN rule.
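
To give a rough sense of the variance-sensitivity point made in the abstract, the minimal Python sketch below contrasts a plain Euclidean k-NN vote with a rank-transformed variant whose distances are insensitive to per-feature scale. This is only an illustrative assumption, not the paper's proposed algorithms, m-MRNN, or the Dempster-Shafer combination; the function names (`knn_predict`, `rank_nn_predict`, `rank_transform`) and the synthetic data are hypothetical.

```python
import numpy as np
from collections import Counter

# Hypothetical illustration: a high-variance, non-discriminative feature can
# dominate Euclidean distances, while ranking each feature within the training
# sample removes the effect of its scale.

def knn_predict(X_train, y_train, x, k=3):
    # Plain Euclidean k-NN vote; high-variance features dominate the distance.
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return Counter(nearest).most_common(1)[0][0]

def rank_transform(X_train, x):
    # Replace each training value by its rank within its own feature column.
    ranks_train = X_train.argsort(axis=0).argsort(axis=0).astype(float)
    # Rank of the query per feature = number of training values below it.
    ranks_x = (X_train < x).sum(axis=0).astype(float)
    return ranks_train, ranks_x

def rank_nn_predict(X_train, y_train, x, k=3):
    # k-NN vote in rank space; per-feature variance no longer matters.
    R_train, r_x = rank_transform(X_train, x)
    d = np.linalg.norm(R_train - r_x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return Counter(nearest).most_common(1)[0][0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Feature 0 separates the classes (unit variance); feature 1 is pure
    # noise with a much larger variance.
    X = np.vstack([
        rng.normal([0.0, 0.0], [1.0, 100.0], size=(50, 2)),  # class 0
        rng.normal([2.0, 0.0], [1.0, 100.0], size=(50, 2)),  # class 1
    ])
    y = np.array([0] * 50 + [1] * 50)
    query = np.array([2.0, 0.0])
    print("Euclidean k-NN :", knn_predict(X, y, query))
    print("Rank-based k-NN:", rank_nn_predict(X, y, query))
```

With data like this, the Euclidean vote is driven almost entirely by the noisy second feature, while the rank-based vote weights both features comparably; this is the kind of situation in which the abstract expects rank-based rules to have an advantage over k-NN and k-NNDST.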
