Abstract

Self-distillation guides model learning by transferring knowledge from the model itself, and it has shown clear advantages in object segmentation. However, it has been shown that uncertain pixels, whose predicted probability is close to 0.5, restrict model performance. Existing self-distillation methods cannot guide the model to improve its learning of uncertain pixels, so their gains are limited. To boost the student model's ability to learn uncertain pixels, a novel self-distillation method is proposed. First, the predicted probability of the current training sample is fused with the ground-truth label to construct the teacher knowledge, since the current prediction reflects the student model's performance and represents pixel uncertainty more accurately. Second, a quadratic mapping function between the predicted probabilities of the teacher and student models is proposed; theoretical analysis shows that this mapping guides the model to strengthen its learning of uncertain pixels. Finally, the essential difference introduced by using the student model's predicted probability in self-distillation is discussed in detail. Extensive experiments were conducted on models with convolutional neural network and Transformer backbones. Results on four public datasets demonstrate that the proposed method effectively improves student model performance.
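The abstract does not give the fusion rule or the exact quadratic mapping, but the overall idea can be sketched. The following is a minimal illustration, assuming a simple convex-combination fusion of the ground-truth label with the student's prediction, and an illustrative quadratic polynomial `f(t) = a*t^2 + (1-a)*t` (chosen only so that `f(0)=0` and `f(1)=1`); the mixing weight `alpha`, the coefficient `a`, and the binary cross-entropy distillation loss are all assumptions, not the paper's definitions.

```python
import numpy as np

def fused_teacher(p_student, y, alpha=0.5):
    """Hypothetical teacher construction: convex combination of the
    ground-truth label y (0/1) and the student's current predicted
    probability. alpha is an assumed mixing weight."""
    return alpha * y + (1.0 - alpha) * p_student

def quadratic_map(t, a=0.5):
    """Illustrative quadratic mapping with f(0)=0 and f(1)=1:
    f(t) = a*t**2 + (1-a)*t. The abstract only states that the
    mapping is quadratic; this concrete form is an assumption."""
    return a * t**2 + (1.0 - a) * t

def distill_loss(p_student, y, alpha=0.5, a=0.5, eps=1e-7):
    """Per-pixel binary cross-entropy of the student's prediction
    against the mapped teacher target, averaged over pixels."""
    t = quadratic_map(fused_teacher(p_student, y, alpha), a)
    p = np.clip(p_student, eps, 1.0 - eps)
    return float(np.mean(-(t * np.log(p) + (1.0 - t) * np.log(1.0 - p))))

# An uncertain foreground pixel (p = 0.5, y = 1) gets a teacher
# target between its own prediction and the hard label:
p = np.array([0.5])
y = np.array([1.0])
print(fused_teacher(p, y)[0])   # 0.75
```

Because the teacher target depends on the student's own prediction, the supervision signal adapts as training progresses, which is the self-distillation aspect the abstract describes.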
