Abstract

We consider the problem of binary classification with the caveat that the learner can abstain from declaring a label, incurring a cost λ ∈ [0, 1/2] in the process. This is referred to as binary classification with a fixed cost of abstention. For this problem, we propose an active learning strategy that constructs a non-uniform partition of the input space and focuses sampling in the regions near the decision boundaries. Our proposed algorithm works in all the commonly used active learning query models, namely membership-query, pool-based, and stream-based. We obtain an upper bound on the excess risk of the algorithm under standard smoothness and margin assumptions and demonstrate its minimax near-optimality by deriving a matching (modulo poly-logarithmic factors) lower bound. The achieved minimax rates are always faster than the corresponding rates in the passive setting, and the improvement increases with larger values of the smoothness and margin parameters.
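As background for the fixed-cost abstention setting, the following is a minimal sketch of the classical Bayes-optimal rule for this problem (Chow's rule): given the posterior η(x) = P(Y = 1 | X = x) and abstention cost λ, abstain whenever the risk of the best label, min(η, 1 − η), exceeds λ; otherwise predict the more likely label. This is an illustration of the problem setup only, not the paper's proposed active learning algorithm; the function names are ours.

```python
def chow_rule(eta: float, lam: float) -> int:
    """Bayes-optimal decision with fixed abstention cost lam (Chow's rule).

    eta: posterior probability P(Y = 1 | X = x) at the query point.
    lam: abstention cost in [0, 1/2].
    Returns +1 or -1 for a label, or 0 to abstain.
    Abstains exactly when the conditional risk of the best label,
    min(eta, 1 - eta), exceeds lam -- i.e. when x lies close to the
    decision boundary eta = 1/2, the region where the paper's
    strategy concentrates its sampling.
    """
    if min(eta, 1.0 - eta) > lam:
        return 0
    return 1 if eta >= 0.5 else -1


def conditional_risk(eta: float, lam: float, decision: int) -> float:
    """Expected cost at a point with posterior eta for a given decision."""
    if decision == 0:
        return lam                      # abstaining always costs lam
    return 1.0 - eta if decision == 1 else eta  # probability of mislabeling
```

Note that with λ ∈ [0, 1/2] the abstention region shrinks as λ grows, and at λ = 1/2 abstaining is never strictly better than guessing the majority label.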

