Abstract

Accurate bird classification is essential for ecological monitoring, and image-based classification has become an increasingly common approach to bird recognition. However, the task must contend with high intra-class variance and low inter-class variance among bird species, as well as low model efficiency. In this paper, we propose a fine-grained bird classification method based on attention and decoupled knowledge distillation. First, we propose an attention-guided data augmentation method that uses attention to extract images of the object's key part regions, enabling the model to learn and distinguish fine-grained features. Following a localization-recognition scheme, the bird category is then predicted from the object image with finer features, which reduces the influence of background noise. Second, we propose a model compression method based on decoupled knowledge distillation: target-class and non-target-class knowledge are distilled separately, eliminating the influence of the target-class prediction on the transfer of non-target-class knowledge and achieving efficient model compression. With 67% fewer parameters and only 1.2 G of computation, the proposed model still reaches an 87.6% accuracy while improving inference speed.
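The abstract does not spell out how the key part regions are obtained. The following is a minimal PyTorch-style sketch of one common realisation of attention-guided cropping, assuming the backbone already produces per-part attention maps (`attn_maps`); the function name, the threshold `theta`, and `out_size` are hypothetical choices for illustration, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def attention_crop(images, attn_maps, theta=0.5, out_size=(448, 448)):
    """Attention-guided cropping sketch: for each image, pick one part
    attention map, threshold it, take the bounding box of the activated
    region, and crop/zoom that region so the classifier sees the
    discriminative details. Assumes non-negative attention maps."""
    B, M, _, _ = attn_maps.shape
    crops = []
    for i in range(B):
        # Randomly choose one part attention map for this image.
        a = attn_maps[i, torch.randint(M, (1,)).item()]
        # Upsample the map to the input resolution and binarise it.
        a = F.interpolate(a[None, None], size=images.shape[-2:],
                          mode="bilinear", align_corners=False)[0, 0]
        mask = a >= theta * a.max()
        ys, xs = mask.nonzero(as_tuple=True)
        y0, y1 = ys.min().item(), ys.max().item() + 1
        x0, x1 = xs.min().item(), xs.max().item() + 1
        # Crop the activated region and resize it to the network input size.
        crop = images[i:i + 1, :, y0:y1, x0:x1]
        crops.append(F.interpolate(crop, size=out_size,
                                   mode="bilinear", align_corners=False))
    return torch.cat(crops, dim=0)
```

In a localization-recognition pipeline, the returned crops would be passed through the classifier a second time, so the final prediction is made on the zoomed-in object rather than on the full image and its background.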
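The abstract likewise only summarises the distillation scheme. Below is a minimal PyTorch-style sketch of a decoupled knowledge distillation loss in the spirit of that description, splitting the usual KL term into a target-class part (TCKD) and a non-target-class part (NCKD); the weights `alpha`, `beta`, and `temperature` are illustrative assumptions rather than values reported in the paper.

```python
import torch
import torch.nn.functional as F

def dkd_loss(student_logits, teacher_logits, target,
             alpha=1.0, beta=8.0, temperature=4.0):
    """Decoupled distillation sketch: a target-class term (TCKD) plus a
    non-target-class term (NCKD) that is no longer weighted by the teacher's
    confidence on the target class. Hyper-parameters are illustrative."""
    t = temperature
    num_classes = student_logits.size(1)
    gt = F.one_hot(target, num_classes).float()          # (B, C), 1 at the label

    p_s = F.softmax(student_logits / t, dim=1)
    p_t = F.softmax(teacher_logits / t, dim=1)

    # TCKD: compare binary (target vs. everything-else) distributions.
    ps_target = (p_s * gt).sum(dim=1)
    pt_target = (p_t * gt).sum(dim=1)
    p_s_bin = torch.stack([ps_target, 1.0 - ps_target], dim=1)
    p_t_bin = torch.stack([pt_target, 1.0 - pt_target], dim=1)
    tckd = F.kl_div(p_s_bin.clamp_min(1e-8).log(), p_t_bin,
                    reduction="batchmean")

    # NCKD: renormalise over non-target classes only, by pushing the target
    # logit towards a very large negative value before the softmax.
    s_nt = student_logits / t - 1e9 * gt
    t_nt = teacher_logits / t - 1e9 * gt
    nckd = F.kl_div(F.log_softmax(s_nt, dim=1), F.softmax(t_nt, dim=1),
                    reduction="batchmean")

    return (alpha * tckd + beta * nckd) * (t ** 2)
```

During training, such a distillation term would typically be added to the student's ordinary cross-entropy loss on the ground-truth labels.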
