Abstract

• We propose a joint training methodology for knowledge distillation and open set recognition to improve model robustness.
• We demonstrate how knowledge distillation (KD) and open set recognition (OSR) can be performed on 3D point cloud data.
• The proposed method yields a much smaller model with enhanced OSR capabilities compared to the original model.
• We find that KD by itself can transfer OSR ability to a student model, in addition to the previously known transfer of dark knowledge.
• We show that a trade-off arises between the student model's open-set and closed-set performance.

Real-world scenarios pose several challenges to deep learning based computer vision techniques despite their tremendous success in research. Deeper models provide better performance but are challenging to deploy; knowledge distillation allows us to train smaller models with minimal loss in performance. A model also has to deal with open set samples from classes outside the ones it was trained on, and should identify them as unknown while classifying the known ones correctly. Finally, most existing image recognition research focuses only on two-dimensional snapshots of three-dimensional real-world objects. In this work, we attempt to bridge these three research fields, which have so far been developed independently despite being deeply interrelated in practice. We propose a joint knowledge distillation and open set recognition training methodology for three-dimensional object recognition. Through various experiments, we demonstrate that the proposed method yields a much smaller model that takes only a minimal hit in performance while being capable of open set recognition on 3D point cloud data.
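Since the abstract describes combining a knowledge distillation objective with open set recognition, a minimal sketch may help clarify how two such components are typically joined. The snippet below is a hypothetical illustration, not the authors' exact method: it pairs the standard Hinton-style KD loss (temperature-softened KL divergence plus hard-label cross-entropy) with a simple maximum-softmax-probability threshold as a baseline open-set decision rule. The function names, temperature, weighting factor `alpha`, and rejection threshold are all assumptions for illustration only.

```python
# Hypothetical sketch (assumed components, not the paper's exact method):
# a standard KD loss combined with a confidence-threshold open-set rule.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend softened teacher/student KL divergence with hard-label cross-entropy."""
    # Soft targets from the (frozen) teacher, softened by temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy against ground-truth labels on the closed set.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce

def open_set_predict(logits, threshold=0.5):
    """Baseline OSR rule: reject a sample as 'unknown' when its max softmax probability is low."""
    probs = F.softmax(logits, dim=1)
    conf, pred = probs.max(dim=1)
    pred[conf < threshold] = -1  # -1 denotes the 'unknown' / open-set class
    return pred
```

In such a setup, the `alpha`/`T` pair controls how strongly the student mimics the teacher versus fitting the labels, which is one place where the open-set versus closed-set trade-off mentioned above could surface.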
