Abstract

Open set recognition requires models to recognize samples of known classes seen during training while rejecting unknowns that were never learned. While structural risk minimization is well established for closed-set problems, structural risk in open set tasks remains largely unexplored. In this paper, we point out that balancing structural risk against open space risk is crucial for open set recognition, and re-formalize this trade-off as open set structural risk. This offers a new view of the relationship between closed set and open set recognition, challenging the common intuition that a good closed set classifier always benefits open set recognition. Specifically, we show theoretically and experimentally that recent mix-based data augmentation methods act as aggressive closed set regularizers, reducing structural risk at the cost of increased open space risk. Moreover, existing negative data augmentation methods designed to reduce open space risk also ignore the trade-off between structural risk and open space risk, which limits their performance. We propose an efficient negative data augmentation strategy named self-mix and a corresponding method named OpenMix. OpenMix generates high-quality negative samples by mixing samples with themselves, addressing both risks simultaneously. When OpenMix is combined with conservative closed set regularization methods to form OpenMix+, models achieve lower open set structural risk. Extensive experiments validate the superiority of OpenMix and OpenMix+ in terms of both effectiveness and universality.
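To make the negative-augmentation idea concrete, below is a minimal sketch based only on the abstract's description: a negative sample is produced by mixing an image with a spatially shuffled copy of itself and labeled as an extra "unknown" class. The patch shuffling, the Beta-sampled mixing coefficient, and the extra-class label are illustrative assumptions, not the paper's exact self-mix procedure.

```python
# Illustrative sketch only: patch size, Beta mixing coefficient, and the extra
# "unknown" class label are assumptions, not the paper's exact self-mix method.
import torch

def patch_shuffle(x: torch.Tensor, patch: int = 8) -> torch.Tensor:
    """Randomly permute non-overlapping patches of a batch of images (N, C, H, W)."""
    n, c, h, w = x.shape
    assert h % patch == 0 and w % patch == 0, "image size must be divisible by patch"
    gh, gw = h // patch, w // patch
    p = x.unfold(2, patch, patch).unfold(3, patch, patch)   # (N, C, gh, gw, patch, patch)
    p = p.contiguous().view(n, c, gh * gw, patch, patch)
    p = p[:, :, torch.randperm(gh * gw)]                    # scramble the spatial layout
    p = p.view(n, c, gh, gw, patch, patch).permute(0, 1, 2, 4, 3, 5)
    return p.contiguous().view(n, c, h, w)

def self_mix_negatives(x: torch.Tensor, num_known: int, alpha: float = 1.0):
    """Mix each image with a shuffled copy of itself; label it as class K ("unknown")."""
    lam = torch.distributions.Beta(alpha, alpha).sample()
    x_neg = lam * x + (1.0 - lam) * patch_shuffle(x)
    y_neg = torch.full((x.size(0),), num_known, dtype=torch.long)  # index K = unknown
    return x_neg, y_neg
```

In a typical setup, such negatives would be appended to each mini-batch so the classifier learns a (K+1)-way decision boundary that tightens the open space around the known classes; how the paper actually trains with them is detailed in the full text.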
