Abstract

Many distance-related algorithms depend on a good distance metric for their success. The Value Difference Metric (VDM) was proposed to provide a reasonable distance measure between pairs of instances described only by nominal attributes. VDM assumes that all attributes are fully independent, and two values of an attribute are considered closer when their correlations with the output classes are more similar. This attribute independence assumption rarely holds in reality, which can harm VDM's performance in applications with complex attribute dependencies. In this paper, we present an improved Value Difference Metric that relaxes this unrealistic independence assumption, called the One Dependence Value Difference Metric (ODVDM). In ODVDM, structure learning algorithms for Bayesian network classifiers, such as tree-augmented naive Bayes (TAN), are used to discover the dependence relationships among the attributes. Our experimental results validate its effectiveness in terms of classification accuracy.
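To make the independence assumption concrete, the sketch below gives a commonly cited formulation of the VDM distance for nominal attributes; this is the standard form from the literature and is not necessarily the exact variant adopted in this paper.

```latex
% Commonly cited per-attribute VDM distance (assumption: the standard
% formulation; the paper's exact variant may differ, e.g. in the choice
% of the exponent q or in normalization).
\[
  \mathrm{vdm}_a(x, y) \;=\; \sum_{c \in C} \bigl|\, P(c \mid a = x) - P(c \mid a = y) \,\bigr|^{q},
  \qquad
  d(\mathbf{u}, \mathbf{v}) \;=\; \sum_{a} \mathrm{vdm}_a(u_a, v_a),
\]
% where C is the set of output classes and q is typically 1 or 2.
% The total distance is a plain sum of per-attribute terms, each computed
% from class probabilities conditioned on a single attribute value; this
% is precisely where the attribute independence assumption enters, and
% it is the term that ODVDM revises using the dependencies found by the
% learned Bayesian network structure.
```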
