Abstract

The last decade has seen increased attention paid to the development of cost‐sensitive learning algorithms that aim to minimize misclassification costs while still maintaining accuracy. Most of this attention has focused on cost‐sensitive decision tree learning, whereas relatively little attention has been paid to assessing whether it is possible to develop better cost‐sensitive classifiers based on Bayesian networks. Hence, this paper presents EBNO, an algorithm that utilizes genetic algorithms to learn cost‐sensitive Bayesian networks, where genes represent the links between the nodes in Bayesian networks and the expected cost is used as the fitness function. An empirical comparison of the new algorithm has been carried out with respect to (a) an algorithm that induces cost‐insensitive Bayesian networks to provide a baseline, (b) ICET, a well‐known algorithm that uses genetic algorithms to induce cost‐sensitive decision trees, (c) use of MetaCost to induce cost‐sensitive Bayesian networks via bagging, (d) use of AdaBoost to induce cost‐sensitive Bayesian networks, and (e) use of XGBoost, a gradient boosting algorithm, to induce cost‐sensitive decision trees. An empirical evaluation on 28 data sets reveals that EBNO performs well in comparison with the algorithms that produce single interpretable models and performs just as well as algorithms that use bagging and boosting methods.
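The abstract describes the core mechanism: a genome whose genes encode the candidate links of a Bayesian network, evolved under a fitness function based on expected misclassification cost. The following is a minimal, hypothetical sketch of that idea only; the network size, cost values, selection scheme, and the stubbed-out fitness are all assumptions, since the abstract does not give EBNO's actual details (in particular, the real fitness would train the encoded network and measure expected cost on data).

```python
import random

N_NODES = 4                       # toy network size (assumption)
GENES = N_NODES * (N_NODES - 1)   # one gene per possible directed link

def expected_cost(genome):
    # Stand-in for the real fitness: EBNO would build the Bayesian
    # network this genome encodes, then average misclassification
    # costs over data. Here we simply penalise dense structures so
    # the sketch runs end to end.
    return sum(genome) + random.random()

def mutate(genome, rate=0.05):
    # Flip each link bit with a small probability.
    return [g ^ (random.random() < rate) for g in genome]

def crossover(a, b):
    # One-point crossover of two link-bit genomes.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, generations=30):
    pop = [[random.randint(0, 1) for _ in range(GENES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=expected_cost)      # lower expected cost = fitter
        parents = pop[:pop_size // 2]    # truncation selection (assumption)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=expected_cost)

best = evolve()
```

The returned genome is the fittest link structure found; in EBNO proper, it would be decoded back into a Bayesian network for classification.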
