Abstract

Deep neural networks (DNNs) are highly effective at object recognition but can assign overconfident scores to erroneous decisions. It is therefore important to check whether the distribution of images to be processed matches that of the training set, a problem studied as out-of-distribution (OoD) detection. This paper proposes an algorithm that performs classification as well as OoD detection, based on the generative classifier (GC) paradigm. While a discriminative classifier (DC) learns p(y|x) to achieve classification and a generative network learns p(x) to achieve OoD detection, the GC learns p(x|y) to achieve both classification and OoD detection. Since generative models such as variational auto-encoders (VAEs) show good performance for unsupervised OoD detection, we propose to structure the latent variable of a VAE with a Gaussian mixture model (GMM) while building a GC. The proposed model is shown to outperform classical DC-based approaches for far OoD, and generative adversarial network (GAN) based models for near OoD, without losing any performance on the classification task.
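The decision rules implied by the abstract can be illustrated with a minimal sketch: once a VAE encoder maps an input to a latent code z, and the latent prior is a Gaussian mixture with one component per class, classification amounts to picking the most likely component p(z|y), and OoD detection amounts to thresholding the mixture likelihood. The means, variance, and threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical per-class GMM components in a 2-D latent space (stand-ins
# for the components the VAE would learn jointly with its encoder/decoder).
mus = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])  # one mean per class
sigma = 1.0  # shared isotropic standard deviation (assumption)

def log_gauss(z, mu, sigma):
    """Log density of an isotropic Gaussian N(mu, sigma^2 I) at z."""
    d = z.shape[-1]
    return -0.5 * (np.sum((z - mu) ** 2, axis=-1) / sigma**2
                   + d * np.log(2 * np.pi * sigma**2))

def classify(z):
    """Generative-classifier rule: argmax_y p(z | y) over GMM components."""
    return int(np.argmax([log_gauss(z, mu, sigma) for mu in mus]))

def is_ood(z, threshold):
    """Flag z as OoD when the GMM (mixture) log-likelihood is low."""
    comp = np.array([log_gauss(z, mu, sigma) for mu in mus])
    # Uniform class prior p(y) = 1/K assumed for the mixture.
    log_mix = np.logaddexp.reduce(comp) - np.log(len(mus))
    return bool(log_mix < threshold)

# An in-distribution latent code near class 1's mean, and a far-away code.
z_in = np.array([4.8, 0.1])
z_out = np.array([20.0, 20.0])
threshold = -10.0  # illustrative likelihood threshold
```

With these assumed values, `classify(z_in)` returns class 1, `is_ood(z_in, threshold)` is `False`, and `is_ood(z_out, threshold)` is `True`; in the paper's setting the components would be learned from data rather than fixed by hand.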
