Abstract

We introduce confidence-weighted (CW) online learning algorithms for robust, cost-sensitive classification. Our work extends the original confidence-weighted optimization framework in two important directions. First, we show how the original value-at-risk (VaR) probabilistic constraint in CW algorithms can be generalized to a worst-case conditional value-at-risk (CVaR) constraint for more robust learning from cost-weighted examples. Second, we show how to reduce adversarial feature noise, which can be useful in fraud detection scenarios, by reframing the optimization problem in terms of maximum a posteriori estimation. The resulting optimization problems can be solved efficiently. Experiments on real-world and synthetic datasets show that our robust, cost-sensitive extensions consistently reduce the cost incurred in both online and batch learning settings. We also demonstrate a correspondence between the VaR and CVaR constraints used for classification and the uncertainty sets used in robust optimization, leading toward a rich family of potential extensions to CW algorithms.
