Bayesian network classifiers (BNCs) have attracted sustained research interest in recent years, among which semi-naive Bayesian classifiers built on superparent one-dependence estimators (SPODEs) have shown superior predictive power. Linear weighting schemes are an effective and efficient way to combine SPODEs, yet finding globally optimal, fixed weights for the SPODE members of averaged one-dependence estimators (AODE) remains a challenging task. The joint probability distribution encoded by a single SPODE may not fit different test instances equally well, so a flexible rather than rigid weighting scheme is needed for AODE to better approximate the true joint probability distribution. Based on this premise, we propose a novel instance-based weighting filter that flexibly assigns a discriminative weight to each SPODE for every test instance. The weight accounts not only for the mutual dependence between the superparent attribute and the class variable, but also for the conditional dependence between the superparent and the non-superparent attributes. Experimental comparison on 30 publicly available datasets shows that AODE with the instance-based weighting filter outperforms state-of-the-art BNCs, with and without weighting methods, in terms of zero–one loss, bias and variance, at minimal additional computational cost.
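To make the idea concrete, the following is a minimal sketch of an AODE whose SPODE members receive per-instance weights. The particular weight used here, a smoothed local mutual information between the observed superparent value and the class, and the Laplace smoothing constant are illustrative assumptions, not the paper's exact weighting filter:

```python
import numpy as np

class InstanceWeightedAODE:
    """Sketch of AODE with per-instance SPODE weights.

    Each attribute acts in turn as superparent; a SPODE estimates
    P(c, x_sp) * prod_j P(x_j | c, x_sp). At prediction time, each
    SPODE's contribution is weighted by a local (instance-specific)
    mutual-information score -- a hypothetical instantiation of the
    instance-based weighting idea, not the paper's method.
    Assumes discrete attributes encoded as integers 0..V-1.
    """

    def __init__(self, alpha=1.0):
        self.alpha = alpha  # Laplace smoothing (illustrative choice)

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.C = int(y.max()) + 1            # number of classes
        self.V = int(X.max()) + 1            # attribute cardinality
        self.n, self.d = X.shape
        # joint counts N(c, x_sp) per superparent attribute
        self.joint = np.zeros((self.d, self.C, self.V))
        # conditional counts N(x_j | c, x_sp): [sp, j, c, v_sp, v_j]
        self.cond = np.zeros((self.d, self.d, self.C, self.V, self.V))
        for xi, yi in zip(X, y):
            for sp in range(self.d):
                self.joint[sp, yi, xi[sp]] += 1
                self.cond[sp, np.arange(self.d), yi, xi[sp], xi] += 1
        return self

    def _spode_log_joint(self, sp, x):
        """log P(c, x_sp) + sum_{j != sp} log P(x_j | c, x_sp)."""
        a = self.alpha
        logp = np.log((self.joint[sp, :, x[sp]] + a)
                      / (self.n + a * self.C * self.V))
        for j in range(self.d):
            if j == sp:
                continue
            num = self.cond[sp, j, :, x[sp], x[j]] + a
            den = self.joint[sp, :, x[sp]] + a * self.V
            logp += np.log(num / den)
        return logp

    def _weight(self, sp, x):
        """Illustrative instance-based weight: local mutual information
        between the observed superparent value x[sp] and the class."""
        a, v = self.alpha, x[sp]
        z = self.n + a * self.C * self.V
        p_cv = (self.joint[sp, :, v] + a) / z
        p_c = (self.joint[sp].sum(axis=1) + a * self.V) / z
        p_v = (self.joint[sp, :, v].sum() + a * self.C) / z
        lmi = np.sum((p_cv / p_v) * np.log(p_cv / (p_c * p_v)))
        return max(lmi, 0.0)                 # clip negative weights

    def predict(self, X):
        preds = []
        for x in np.asarray(X):
            w = np.array([self._weight(sp, x) for sp in range(self.d)])
            w = w / w.sum() if w.sum() > 0 else np.full(self.d, 1 / self.d)
            probs = np.zeros(self.C)
            for sp in range(self.d):
                probs += w[sp] * np.exp(self._spode_log_joint(sp, x))
            preds.append(int(np.argmax(probs)))
        return np.array(preds)
```

With uniform weights this reduces to plain AODE; the point of the filter is that the weight vector `w` is recomputed for each test instance, so a SPODE whose superparent value is informative for that instance contributes more.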