Abstract

Deep hashing has proven efficient and effective for large-scale face retrieval. However, existing hashing methods are designed only for normal face images; they do not account for the fact that faces may be occluded by masks, hats, glasses, etc., and their retrieval performance degrades markedly on occluded face images. In this work, we propose knowledge distillation hashing (KDH) to handle occluded face images. KDH is a two-stage learning approach with teacher-student model distillation: we first train a teacher hashing network on normal face images, and the knowledge from the teacher model then guides the optimization of a student model that takes only occluded face images as input. Through knowledge distillation, we build a connection between imperfect face information and the optimal hash codes. Experimental results show that KDH yields significant improvements and better retrieval performance than existing state-of-the-art deep hashing retrieval methods under six different face occlusion situations.
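
The abstract does not specify the network architecture or the exact distillation objective, so the following is only a minimal sketch of the two-stage idea: a teacher hashing network trained on normal faces (stage one, omitted here) whose relaxed hash codes supervise a student that sees only occluded faces. The HashNet module, the train_step helper, the MSE distillation criterion, and the input size are all assumptions made for illustration, not the paper's actual implementation.

import torch
import torch.nn as nn

class HashNet(nn.Module):
    """Backbone plus hashing head producing K-bit relaxed codes in (-1, 1)."""
    def __init__(self, feat_dim=512, n_bits=48):
        super().__init__()
        # Placeholder backbone; the paper would use a face-recognition CNN.
        self.backbone = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * 112 * 112, feat_dim), nn.ReLU()
        )
        self.hash_head = nn.Linear(feat_dim, n_bits)

    def forward(self, x):
        # tanh relaxation of binary hash codes
        return torch.tanh(self.hash_head(self.backbone(x)))

# Stage 1: train the teacher on normal face images with a standard
# deep-hashing objective (not shown), then freeze it.
teacher = HashNet()
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)

# Stage 2: distill into a student that only ever sees occluded faces.
student = HashNet()
opt = torch.optim.Adam(student.parameters(), lr=1e-4)
distill_loss = nn.MSELoss()  # assumed distillation criterion

def train_step(occluded_imgs, normal_imgs):
    """One hypothetical distillation step: match student codes on occluded
    inputs to teacher codes on the corresponding normal inputs."""
    with torch.no_grad():
        target_codes = teacher(normal_imgs)    # teacher sees the clean face
    student_codes = student(occluded_imgs)     # student sees the occluded face
    loss = distill_loss(student_codes, target_codes)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example with random tensors standing in for paired (occluded, normal) faces.
occluded = torch.randn(4, 3, 112, 112)
normal = torch.randn(4, 3, 112, 112)
print(train_step(occluded, normal))

# At retrieval time, codes are binarized, e.g. torch.sign(student(occluded)).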
