Abstract

Deep neural networks excel at learning patterns from finite training data but often produce incorrect predictions with high confidence when faced with out-of-distribution data. In this work, we propose a data-agnostic framework called Stochastic Ghost Batch Augmentation (SGBA) to address this issue. SGBA stochastically augments activation units at selected training iterations to correct the model's irregular prediction behavior by leveraging the partial generalization ability of the intermediate model. A self-distilled dynamic soft label is introduced as a regularization term to establish the lost connection between predictions on original and augmented data: it incorporates a similarity prior over the vicinity distribution of the raw samples rather than forcing the model to conform to static hard labels. Moreover, the induced stochasticity removes much of the unnecessary, redundant computation incurred by conventional batch augmentation, which is performed at every pass. The proposed regularization provides direct supervision through the KL divergence between the output softmax distributions of the original and virtual (augmented) data, and enforces distribution matching to fuse the complementary information in the model's predictions, which become gradually more mature and stable as training proceeds. In essence, it acts as a dynamic check on the generalization of the neural network during training. Extensive performance evaluations demonstrate the superiority of the proposed framework.
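
To make the described regularization concrete, the sketch below shows one possible PyTorch implementation of the stochastically gated KL-divergence consistency term between the softmax outputs on original and augmented data. It is a minimal illustration, not the authors' code: the augmentation function, gating probability `p_aug`, weight `alpha`, and temperature are assumptions introduced here for exposition.

```python
# Minimal sketch (not the authors' implementation) of the KL-based
# consistency regularization described in the abstract, assuming PyTorch.
import torch
import torch.nn.functional as F

def sgba_step(model, x, y, augment, p_aug=0.5, alpha=1.0, temperature=1.0):
    """One training step with a stochastically applied consistency term."""
    logits = model(x)
    loss = F.cross_entropy(logits, y)  # standard supervised loss on hard labels

    # Stochastic gate: only a fraction of iterations pay the augmentation cost.
    if torch.rand(1).item() < p_aug:
        with torch.no_grad():
            # Self-distilled dynamic soft label: the model's own prediction
            # on the original sample, treated as a fixed target.
            soft_label = F.softmax(logits.detach() / temperature, dim=1)
        aug_logits = model(augment(x))  # "virtual" (augmented) batch
        log_q = F.log_softmax(aug_logits / temperature, dim=1)
        # KL divergence between the two softmax distributions enforces
        # distribution matching between original and virtual data.
        kl = F.kl_div(log_q, soft_label, reduction="batchmean")
        loss = loss + alpha * kl

    return loss
```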


