Improving Accuracy of Out-of-Distribution Detection and In-Distribution Classification by Incorporating JSD Consistency Loss

Highlights

  • Deep learning has garnered considerable attention owing to its ability to make accurate decisions from images and other data, achieving state-of-the-art (SoTA) results in diverse tasks

  • We demonstrated that incorporating a Jensen–Shannon divergence (JSD) consistency loss when training the rotation predictor improves representation-learning performance (a minimal sketch of such a loss follows this list)

  • Both SimSiam and SimCLR exhibited their best representation-learning performance when the proposed method was introduced
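
A minimal PyTorch sketch of a JSD consistency loss is shown below, assuming a two-view form; the paper's exact formulation (number of views, how the term is weighted against the rotation-prediction loss) is not given on this page, so the details here are assumptions.

```python
# Minimal sketch of a JSD consistency loss (assumed two-view form; the
# paper's exact formulation may use more views or a different weighting).
import torch
import torch.nn.functional as F

def jsd_consistency_loss(logits_a: torch.Tensor,
                         logits_b: torch.Tensor) -> torch.Tensor:
    """Jensen-Shannon divergence between the predictive distributions
    produced for two augmented views of the same batch of images."""
    p_a = F.softmax(logits_a, dim=1)
    p_b = F.softmax(logits_b, dim=1)
    # Log of the mixture distribution M = (p_a + p_b) / 2,
    # clamped before the log for numerical stability.
    log_m = ((p_a + p_b) / 2).clamp(min=1e-7).log()
    # JSD(p_a, p_b) = 0.5 * [KL(p_a || M) + KL(p_b || M)].
    # F.kl_div(input, target) expects log-probabilities as `input`.
    return 0.5 * (F.kl_div(log_m, p_a, reduction="batchmean")
                  + F.kl_div(log_m, p_b, reduction="batchmean"))
```

In training, such a term would typically be added to the rotation-prediction cross-entropy with a weighting coefficient, e.g. loss = ce + lam * jsd_consistency_loss(logits_a, logits_b), where lam is a hypothetical hyperparameter, not a value from the paper.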


Introduction

Deep learning has garnered considerable attention owing to its ability to make accurate decisions from images and other data, achieving state-of-the-art (SoTA) results in diverse tasks. For image classification tasks where sufficiently large datasets are available, AlexNet [1], VGG [2], ResNet [3], WideResNet [4], and EfficientNet [5] surpass all non-deep-learning methods. These deep learning classifiers achieve high in-distribution (IN-D) classification accuracy. Suppose, however, that a classifier trained to distinguish ten dog breeds is given an image of a cat. In such cases, the classifier typically assigns a high probability to one of the classes, even though the image does not fit into any of them.
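A toy illustration of this over-confidence is given below; the logits are invented for illustration, not taken from the paper. Softmax always normalizes the logits into a distribution over the known classes, so even an out-of-distribution input can receive a confident-looking top-1 probability.

```python
# Toy example: a hypothetical 10-way "dog breed" classifier given a cat
# image can still emit a confident top-1 probability, because softmax
# must distribute all probability mass over the ten known classes.
import torch
import torch.nn.functional as F

logits = torch.tensor([2.5, 0.3, 0.1, -0.2, -0.5,
                       -0.7, -1.0, -1.2, -1.5, -2.0])  # invented values
probs = F.softmax(logits, dim=0)
print(f"top-1 probability: {probs.max().item():.2f}")  # ~0.69 despite the OOD input
```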
