Abstract

When there is not enough information to single out one class value, some classifiers return a set of candidate values of the class variable; these are known as imprecise classifiers. Decision Trees for Imprecise Classification have previously been proposed and adapted to account for error costs when classifying new instances. In this work, we present a new cost-sensitive Decision Tree for Imprecise Classification that handles error costs by weighting instances and also incorporates these costs into the tree-building process. Our method uses the Nonparametric Predictive Inference Model, a nonparametric model that, unlike earlier imprecise-probability models, assumes no prior knowledge about the data. We show that our proposal can give more informative predictions than the existing cost-sensitive Decision Tree for Imprecise Classification. Experimental results reveal that our cost-sensitive Decision Tree significantly outperforms the existing one in Imprecise Classification: although the cost of erroneous classifications is higher with our proposal, it tends to provide more informative predictions.
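The abstract mentions handling error costs by weighting instances. The paper's exact weighting scheme is not given here, but a common generic approach in cost-sensitive learning is to weight each training instance by the total cost of misclassifying its true class, then normalize. The sketch below illustrates that generic idea only; the function name, the cost matrix, and the normalization convention are illustrative assumptions, not the authors' method.

```python
import numpy as np

def cost_weights(y, cost_matrix):
    """Hypothetical cost-sensitive instance weighting.

    y           : array of true class indices for the training instances.
    cost_matrix : cost_matrix[i, j] = cost of predicting class j
                  when the true class is i (zeros on the diagonal).
    Returns one weight per instance, normalized so weights sum to n.
    """
    y = np.asarray(y)
    C = np.asarray(cost_matrix, dtype=float)
    per_class = C.sum(axis=1)        # total misclassification cost per true class
    w = per_class[y]                 # raw weight for each instance
    return w * len(y) / w.sum()      # normalize: weights sum to len(y)

# Example: misclassifying class 1 is twice as costly as misclassifying class 0,
# so class-1 instances receive proportionally larger weights.
y = [0, 0, 0, 1]
C = [[0, 1],
     [2, 0]]
print(cost_weights(y, C))            # → [0.8 0.8 0.8 1.6]
```

Such weights could then be fed to any tree learner that accepts sample weights, so that costly classes influence both split selection and the predictions at the leaves.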
