Abstract

This paper proposes a method for plant leaf instance segmentation based on knowledge distillation. Unlike conventional knowledge distillation, which aims to compress a large teacher into a lightweight student, the teacher and student networks here share an identical architecture. A plant typically has many leaves, each of which is small, which makes segmenting individual leaves difficult. To separate the leaves, pixels are clustered via spatial embeddings, and both the teacher and student networks perform segmentation on these embeddings. The teacher network is first trained on a large dataset and then transfers its segmentation knowledge to the student network through two mechanisms: feature distillation and attention distillation. Experimental results demonstrate that better instance segmentation is achieved when knowledge distillation is used.
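
The abstract names two distillation signals, features and attention. A minimal sketch of how such losses are commonly formulated is given below, assuming PyTorch, matching teacher/student feature shapes (identical architectures), and attention maps formed from squared activations; all function and variable names are illustrative, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    # Collapse the channel dimension into a spatial attention map
    # (sum of squared activations), then L2-normalise per sample.
    attn = feat.pow(2).sum(dim=1).flatten(1)  # (B, H*W)
    return F.normalize(attn, dim=1)

def distillation_loss(student_feat: torch.Tensor,
                      teacher_feat: torch.Tensor,
                      alpha: float = 1.0,
                      beta: float = 1.0) -> torch.Tensor:
    # Feature distillation: match the student's intermediate features
    # to the (frozen) teacher's features directly.
    feat_loss = F.mse_loss(student_feat, teacher_feat.detach())
    # Attention distillation: match spatial attention maps derived
    # from those features rather than the raw activations.
    attn_loss = F.mse_loss(attention_map(student_feat),
                           attention_map(teacher_feat.detach()))
    return alpha * feat_loss + beta * attn_loss
```

In practice such a loss would be added to the student's spatial-embedding segmentation loss, with the weights alpha and beta (assumed here) balancing the two distillation terms.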
