Abstract

Bayesian network classifiers (BNCs) have attracted great interest in recent years, among which semi-naive Bayesian classifiers that utilize superparent one-dependence estimators (SPODEs) have shown superior predictive power. Linear weighting schemes are an effective and efficient way to combine SPODEs, yet finding globally optimal, fixed weights for the SPODE members of averaged one-dependence estimators (AODE) remains a challenging task. The joint probability distribution estimated by a single SPODE may not fit different test instances to the same extent, so a flexible rather than rigid weighting scheme is a feasible way for the final AODE to approximate the true joint probability distribution. Based on this premise, we propose a novel instance-based weighting filter, which flexibly assigns a discriminative weight to each SPODE for each test instance. The weight accounts not only for the mutual dependence between the superparent and the class variable, but also for the conditional dependence between the superparent and the non-superparent attributes. Experimental comparison on 30 publicly available datasets shows that SPODE with the instance-based weighting filter outperforms state-of-the-art BNCs, with and without weighting methods, in terms of zero–one loss, bias and variance, with minimal additional computation.
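
The abstract does not give the exact weighting formula, so the sketch below is only one plausible illustration of the idea: each SPODE's weight for a test instance combines the pointwise mutual information between the observed superparent value and the class with the average pointwise conditional mutual information between the superparent value and the other observed attribute values, so the weights change from instance to instance. All names (p_y, p_vy, p_vuy, local_mi, local_cmi, spode_joint, predict) are hypothetical, and the probability tables are assumed to be estimated with smoothing from training-data frequency counts; this is not the authors' implementation.

```python
import numpy as np

# Assumed, precomputed probability tables (smoothed from training counts):
#   p_y           : shape (C,),        class priors P(y)
#   p_vy[p]       : shape (V_p, C),    joints P(x_p = v, y)
#   p_vuy[(p, j)] : shape (V_p, V_j, C), joints P(x_p = v, x_j = u, y)

def local_mi(v, p_vy_p, p_y):
    """Pointwise dependence of superparent value v on the class:
    sum_y P(v, y) * log( P(v, y) / (P(v) * P(y)) )."""
    p_v = p_vy_p[v].sum()  # marginal P(x_p = v)
    return sum(
        p_vy_p[v, y] * np.log(p_vy_p[v, y] / (p_v * p_y[y]))
        for y in range(len(p_y))
        if p_vy_p[v, y] > 0
    )

def local_cmi(v, u, p_vuy_pj, p_vy_p, p_vy_j, p_y):
    """Pointwise conditional dependence of superparent value v and attribute
    value u given the class:
    sum_y P(v, u, y) * log( P(v, u | y) / (P(v | y) * P(u | y)) )."""
    total = 0.0
    for y in range(len(p_y)):
        joint = p_vuy_pj[v, u, y]
        if joint > 0.0:
            total += joint * np.log(joint * p_y[y] / (p_vy_p[v, y] * p_vy_j[u, y]))
    return total

def instance_weight(x, p, p_y, p_vy, p_vuy):
    """Weight of the SPODE with superparent p, evaluated at test instance x."""
    others = [j for j in range(len(x)) if j != p]
    w = local_mi(x[p], p_vy[p], p_y)
    w += np.mean([local_cmi(x[p], x[j], p_vuy[(p, j)], p_vy[p], p_vy[j], p_y)
                  for j in others])
    return w

def spode_joint(x, y, p, p_vy, p_vuy):
    """SPODE estimate of P(y, x) with attribute p as the superparent:
    P(y, x_p) * prod_{j != p} P(x_j | y, x_p)."""
    prob = p_vy[p][x[p], y]
    for j in range(len(x)):
        if j != p:
            # P(x_j | y, x_p) = P(x_p, x_j, y) / P(x_p, y)
            prob *= p_vuy[(p, j)][x[p], x[j], y] / p_vy[p][x[p], y]
    return prob

def predict(x, p_y, p_vy, p_vuy):
    """Instance-weighted average of SPODE joint estimates; returns the class
    with the highest weighted score."""
    n_classes, n_attrs = len(p_y), len(x)
    weights = [instance_weight(x, p, p_y, p_vy, p_vuy) for p in range(n_attrs)]
    scores = [sum(weights[p] * spode_joint(x, y, p, p_vy, p_vuy)
                  for p in range(n_attrs))
              for y in range(n_classes)]
    return int(np.argmax(scores))
```

Because the weights are computed from the test instance's own attribute values, two instances can weight the same set of SPODEs differently, which is the flexibility the abstract contrasts with AODE's fixed, uniform weights.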
