Abstract

This paper aims to provide a compact but accessible introduction to Conformal Predictors (CP), a Machine Learning method with the distinguishing property of producing predictions that exhibit a chosen error rate. This property, referred to as validity, is backed not only by asymptotic but also by finite-sample probabilistic guarantees. CPs differ from the conventional approach to prediction in that they introduce hedging in the form of set-valued predictions. The CP validity guarantees require no assumptions such as priors; they rely solely on exchangeability, which makes them broadly applicable. The CP framework is universal in the sense that it operates on top of virtually any Machine Learning method. In addition to the formal definition, this introduction discusses CP variants that can be computed efficiently (Inductive or “split” CP) or that are suitable for imbalanced data sets (class-conditional CP). Finally, a short survey of the field provides references for relevant research and highlights the variety of domains in which CPs have found valuable application.
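To make the idea concrete, the following is a minimal sketch of the Inductive ("split") CP variant mentioned above, applied to regression. It is an illustrative assumption, not the paper's own implementation: the synthetic data, the least-squares base model, and the absolute-residual nonconformity score are all choices made here for brevity. The key point it demonstrates is that CP wraps an arbitrary underlying model and converts its point predictions into set-valued (interval) predictions calibrated to a chosen error rate.

```python
# Minimal sketch of Inductive (split) Conformal Prediction for regression.
# Assumptions (not from the paper): synthetic linear data, an ordinary
# least-squares base model, and absolute residuals as nonconformity scores.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + Gaussian noise
x = rng.uniform(0, 10, size=1000)
y = 2 * x + rng.normal(0, 1, size=1000)

# Split the data: proper training set vs. calibration set
x_train, y_train = x[:500], y[:500]
x_cal, y_cal = x[500:], y[500:]

# Underlying model -- CP operates on top of virtually any method;
# here, plain least squares fit with NumPy.
A = np.vstack([x_train, np.ones_like(x_train)]).T
coef, _, _, _ = np.linalg.lstsq(A, y_train, rcond=None)

def predict(xs):
    return coef[0] * np.asarray(xs) + coef[1]

# Nonconformity scores on the calibration set: absolute residuals
scores = np.abs(y_cal - predict(x_cal))

# For significance level eps, take the ceil((n+1)(1-eps))-th smallest score;
# this yields finite-sample marginal coverage >= 1 - eps under exchangeability.
eps = 0.1
n = len(scores)
k = int(np.ceil((n + 1) * (1 - eps)))
q = np.sort(scores)[k - 1]

def predict_interval(x_new):
    """Set-valued prediction: an interval at significance level eps."""
    y_hat = predict(x_new)
    return y_hat - q, y_hat + q

lo, hi = predict_interval(5.0)
print(f"90% prediction interval at x=5: [{lo:.2f}, {hi:.2f}]")
```

The split into training and calibration sets is what makes this variant computationally cheap: the base model is fitted once, and calibration reduces to sorting the scores, in contrast to full (transductive) CP, which refits for every candidate label.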
