Abstract

Out-of-distribution (OOD) detection is critical to the security and stability of deep learning models deployed in the real world. Existing neural-network-based OOD detection algorithms typically rely on a single scoring function, usually derived from the posterior probability, and therefore do not fully exploit the information contained in the pre-trained model. In this paper, building on our previous PEDCC-based work, we explore feature fusion for OOD detection to take maximum advantage of the pre-trained classifier's features. Our improved method adopts a two-stage training scheme: multiple OOD detection features are extracted from the first-stage neural network classifier and used as the input to the second-stage training. In addition, we propose a stop-near-saturation strategy that helps the OOD detection algorithm select optimal network parameters without access to any OOD data. Extensive experiments on several public datasets and classification networks show that our method achieves better OOD detection performance than existing methods while retaining the low computational complexity of the original PEDCC-based approach.
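To make the two-stage idea concrete, below is a minimal, illustrative sketch: several OOD-related scores are extracted from a frozen first-stage classifier and fused as the input to a small second-stage detector trained on in-distribution data only. The classifier layout, the three particular scores (max softmax probability, energy, feature norm), and the second-stage MLP are assumptions for illustration; they are not the paper's exact PEDCC-based features or detector.

```python
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    """Stand-in for the pre-trained first-stage classifier (illustrative)."""
    def __init__(self, in_dim=32, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU())
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):
        feats = self.backbone(x)         # penultimate features
        return feats, self.head(feats)   # features and class logits

@torch.no_grad()
def fuse_ood_scores(classifier, x):
    """Stage 1: extract and fuse multiple OOD-related scores per sample."""
    feats, logits = classifier(x)
    msp = logits.softmax(dim=1).max(dim=1).values   # max softmax probability
    energy = torch.logsumexp(logits, dim=1)         # energy score
    feat_norm = feats.norm(dim=1)                   # feature magnitude
    return torch.stack([msp, energy, feat_norm], dim=1)  # shape (N, 3)

# Stage 2: a small detector trained on the fused scores of
# in-distribution samples only (no OOD data is accessed).
detector = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 1))

classifier = TinyClassifier().eval()
x = torch.randn(8, 32)                   # dummy in-distribution batch
fused = fuse_ood_scores(classifier, x)   # second-stage input
scores = detector(fused)                 # thresholded to decide ID vs. OOD
```

In this sketch the first-stage classifier stays frozen, so the fusion step adds only a handful of extra operations per sample, which is consistent with the low computational overhead emphasized in the abstract.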
