Abstract

Naive Bayes (NB) is easy to construct but surprisingly effective, and it is one of the top ten classification algorithms in data mining. The conditional independence assumption of NB ignores dependencies between attributes, so its probability estimates are often suboptimal. Hidden naive Bayes (HNB) adds a hidden parent to each attribute, which can reflect dependencies from all the other attributes. Compared with other Bayesian network algorithms, it offers significant improvements in classification performance and avoids structure learning. However, the assumption that HNB treats each instance as equivalent in terms of probability estimation does not always hold in real-world applications. To reflect the different influences of different instances in HNB, the HNB model is modified into an improved HNB model. A novel hybrid approach, called instance weighted hidden naive Bayes (IWHNB), is proposed in this paper. IWHNB combines instance weighting with the improved HNB model into one uniform framework: instance weights are incorporated into the improved HNB model when calculating probability estimates. Extensive experimental results show that IWHNB obtains significant improvements in classification performance compared with NB, HNB and other state-of-the-art competitors. Meanwhile, IWHNB maintains the low time complexity that characterizes HNB.
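As a hedged sketch of the mechanism the abstract describes, HNB is commonly formulated as follows: each attribute $A_i$ receives a hidden parent $hp_i$ whose conditional probability is a weighted mixture over the dependencies on all other attributes. The notation below follows the standard HNB formulation and may not match this paper's exact symbols:

```latex
% HNB classification: each attribute A_i gets a hidden parent hp_i that
% mixes the dependencies from all other attributes; the mixture weights
% W_{ij} are typically derived from conditional mutual information.
c(x) = \arg\max_{c} \; P(c) \prod_{i=1}^{n} P(a_i \mid hp_i, c),
\qquad
P(a_i \mid hp_i, c) = \sum_{j=1,\, j \neq i}^{n} W_{ij}\, P(a_i \mid a_j, c),
\quad \sum_{j \neq i} W_{ij} = 1 .
```

Because the hidden parents are defined directly from pairwise statistics, no explicit structure learning is required, which is why HNB keeps a low time complexity.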

Highlights

Published: 22 November 2021

  • We propose a novel hybrid model, referred to as instance weighted hidden naive Bayes (IWHNB), which combines instance weighting with an improved hidden naive Bayes (HNB) model in one uniform framework

  • We review related work on existing instance weighting approaches and find that the Bayesian network used in these studies is limited to naive Bayes (NB)


Summary

Introduction

Bayesian network (BN) combines knowledge of network topology and probability, and it is a classical method for predicting the class of a test instance [1]. Among various structure extension approaches, hidden naive Bayes (HNB) is an improved model that essentially combines mixture dependencies of attributes [33]. We propose a novel hybrid model, referred to as instance weighted hidden naive Bayes (IWHNB), which combines instance weighting with an improved HNB model into one uniform framework. In our IWHNB approach, the improved HNB model is proposed to approximate the ground-truth attribute dependencies, while the instance weights reflect the different influences of different instances; the resulting model inherits the effectiveness of HNB. Although some training time is spent calculating the weight of each instance, the experimental results show that the proposed IWHNB approach is still simple and efficient.
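The core idea of incorporating instance weights into probability estimates can be sketched with a simplified weighted naive Bayes trainer. This is only an illustration of the instance-weighting step, not the full IWHNB model: the hidden-parent structure of HNB is omitted, and all names and parameters here are illustrative assumptions, not the paper's API.

```python
# Hedged sketch: instance-weighted probability estimation for a naive
# Bayes classifier over discrete attributes. Each training instance
# contributes its weight w_i (rather than a count of 1) to the class
# priors and conditional probability tables, with Laplace smoothing.
from collections import defaultdict

def train_weighted_nb(X, y, weights, n_values, n_classes):
    """X: list of discrete attribute vectors; y: class labels;
    weights: per-instance weights; n_values[i]: cardinality of
    attribute i; n_classes: number of class labels."""
    class_w = defaultdict(float)   # weighted count per class
    cond_w = defaultdict(float)    # weighted count per (attr, value, class)
    total_w = 0.0
    for x, c, w in zip(X, y, weights):
        total_w += w
        class_w[c] += w
        for i, v in enumerate(x):
            cond_w[(i, v, c)] += w

    def prior(c):
        # Weighted, Laplace-smoothed class prior.
        return (class_w[c] + 1.0) / (total_w + n_classes)

    def cond(i, v, c):
        # Weighted, Laplace-smoothed conditional P(a_i = v | c).
        return (cond_w[(i, v, c)] + 1.0) / (class_w[c] + n_values[i])

    def predict(x):
        # Pick the class maximizing P(c) * prod_i P(a_i | c).
        best_c, best_p = None, -1.0
        for c in range(n_classes):
            p = prior(c)
            for i, v in enumerate(x):
                p *= cond(i, v, c)
            if p > best_p:
                best_c, best_p = c, p
        return best_c

    return predict
```

With uniform weights this reduces to ordinary naive Bayes; non-uniform weights shift the estimated tables toward the more influential instances, which is the effect IWHNB exploits inside the improved HNB model.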

Structure Extension
Instance Weighting
Instance Weighted Hidden Naive Bayes
The Instance Weighted Hidden Naive Bayes Model
The Weight of Each Instance
Experiments and Results
Conclusions and Future Work
