Abstract

Algorithmic decision making has proliferated and now impacts our daily lives in both mundane and consequential ways. Machine learning practitioners use a myriad of algorithms for predictive models in applications as diverse as movie recommendations, medical diagnoses, and parole recommendations without delving into the reasons driving specific predictive decisions. The algorithms in such applications are often chosen for their superior performance among a pool of competing algorithms; however, popular choices such as random forests and deep neural networks fail to provide an interpretable understanding of the model's predictions. In recent years, rule‐based algorithms have provided a valuable alternative to address this issue. Previous work established an or‐of‐and (disjunctive normal form) based classification technique that allows for classification rule mining of a single class in a binary classification problem. In this work, we extend this idea to provide classification rules for both classes simultaneously. That is, we provide a distinct set of rules for each of the positive and negative classes. We also present a novel and complete taxonomy of classifications that clearly captures and quantifies the inherent ambiguity of noisy binary classifications in the real world. We show that this approach leads to a more granular formulation of the likelihood model, and that a simulated annealing‐based optimization achieves classification performance competitive with comparable techniques. We apply our method to synthetic and real‐world data sets for comparison with other related methods to demonstrate the utility of our contribution.
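To make the two-class rule idea concrete, the following is a minimal sketch of how a pair of or-of-and (DNF) rule sets could be evaluated, yielding the four-way outcome taxonomy the abstract alludes to. The rule sets, feature names, and thresholds here are purely illustrative assumptions, not the rules or model from the paper itself.

```python
# Illustrative sketch: two-class DNF (or-of-and) rule classification.
# All rules and feature names below are hypothetical examples.

def matches_dnf(rules, x):
    """A DNF rule set fires if ANY clause (an AND of conditions) holds."""
    return any(all(cond(x) for cond in clause) for clause in rules)

# Each clause is a list of predicates over a feature dict.
positive_rules = [
    [lambda x: x["age"] > 50, lambda x: x["bp"] == "high"],
    [lambda x: x["glucose"] > 140],
]
negative_rules = [
    [lambda x: x["age"] <= 30, lambda x: x["glucose"] < 100],
]

def classify(x):
    """Return one of four outcomes, making classification ambiguity explicit."""
    pos = matches_dnf(positive_rules, x)
    neg = matches_dnf(negative_rules, x)
    if pos and not neg:
        return "positive"
    if neg and not pos:
        return "negative"
    return "both" if pos else "neither"
```

Because each class has its own rule set, an example can match both sets or neither, and a likelihood model defined over these four outcomes (rather than a binary one) is what permits the more granular formulation the abstract describes.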
