Abstract

As a major branch of the weakly supervised learning paradigm, partial label learning (PLL) addresses the setting in which each sample is associated with a set of ambiguous candidate labels that contains the unknown true label. The primary difficulty of PLL lies in this label ambiguity; most existing studies focus on knowledge from individual instances while ignoring the value of cross-sample knowledge. To circumvent this difficulty, this work proposes an innovative multi-task framework that integrates self-supervision and self-distillation to tackle the PLL problem. Specifically, in the self-distillation task, cross-sample knowledge within the same batch is used to refine the ensembled soft targets that supervise the distillation, without requiring multiple networks. An auxiliary self-supervised task of recognizing rotation transformations of images provides additional supervisory signals for feature learning. Overall, training supervision is constructed not only from each input itself but also from the other instances in the same batch. Empirical results on benchmark datasets show that this method is effective in learning from partially labeled data.
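To make the described objectives concrete, below is a minimal PyTorch sketch of the two auxiliary tasks, written under our own assumptions rather than taken from the paper: the network `PLLNet`, the helpers `rotate_batch`, `batch_soft_targets`, and `training_step`, the uniform-over-candidates classification target, and the 50/50 mixing with the batch-mean prediction are all hypothetical stand-ins for the authors' actual components.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PLLNet(nn.Module):
    """Toy network with a shared encoder, a classification head, and a
    4-way rotation head (hypothetical architecture, for illustration only)."""
    def __init__(self, num_classes):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.cls_head = nn.Linear(16, num_classes)
        self.rot_head = nn.Linear(16, 4)

def rotate_batch(x):
    """Return 0/90/180/270-degree rotated copies of an image batch,
    plus the 4-way rotation labels for the auxiliary task."""
    views = torch.cat([torch.rot90(x, k, dims=(2, 3)) for k in range(4)])
    labels = torch.arange(4, device=x.device).repeat_interleave(x.size(0))
    return views, labels

def batch_soft_targets(logits, candidate_mask, temperature=2.0):
    """Refine soft targets with cross-sample (batch-level) knowledge:
    soften the model's own predictions, mix in the batch-average
    prediction (a crude stand-in for the paper's ensembled targets),
    and restrict the result to each sample's candidate set."""
    probs = F.softmax(logits.detach() / temperature, dim=1)
    mixed = 0.5 * probs + 0.5 * probs.mean(dim=0, keepdim=True)
    mixed = mixed * candidate_mask  # the true label lies in the candidate set
    return mixed / mixed.sum(dim=1, keepdim=True).clamp_min(1e-12)

def training_step(net, x, candidate_mask, alpha=0.1, beta=1.0):
    logits = net.cls_head(net.encoder(x))

    # PLL classification loss: uniform target over the candidate set
    # (a common baseline; the paper's exact PLL loss may differ).
    targets = candidate_mask / candidate_mask.sum(dim=1, keepdim=True)
    cls_loss = F.kl_div(F.log_softmax(logits, dim=1), targets,
                        reduction='batchmean')

    # Self-distillation against the batch-refined soft targets.
    soft = batch_soft_targets(logits, candidate_mask)
    distill_loss = F.kl_div(F.log_softmax(logits, dim=1), soft,
                            reduction='batchmean')

    # Auxiliary self-supervised rotation prediction on the shared encoder.
    xr, yr = rotate_batch(x)
    rot_loss = F.cross_entropy(net.rot_head(net.encoder(xr)), yr)

    return cls_loss + beta * distill_loss + alpha * rot_loss

# Usage: random images with random candidate sets that include a "true" label.
net = PLLNet(num_classes=10)
x = torch.randn(8, 3, 32, 32)
mask = (torch.rand(8, 10) < 0.3).float()
mask[torch.arange(8), torch.randint(0, 10, (8,))] = 1.0
training_step(net, x, mask).backward()
```

The single-network design mirrors the abstract's point that distillation is supervised by batch-refined targets rather than by a separate teacher model; the loss weights `alpha` and `beta` are arbitrary placeholders.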
