Abstract

Long-tailed learning is attracting increasing attention because real-world data typically follow imbalanced distributions, and the goal is to train deep models that perform well despite this imbalance. Traditional knowledge transfer methods for long-tailed learning can be classified into feature-based horizontal knowledge transfer (HKT) and class-based vertical knowledge transfer (VKT). HKT transfers head-to-tail feature knowledge across different classes to improve classification performance when tail classes have few samples. However, HKT is prone to ineffective transfer because of the bias caused by the difference between head-class and tail-class knowledge. Fortunately, the class space has a multi-granularity structure and can form a multi-granularity knowledge graph (MGKG), which can be recast as coarse-grained and fine-grained losses to guide VKT. In this paper, we propose a hierarchical long-tailed classification method based on multi-granularity knowledge transfer (MGKT), which vertically transfers knowledge from coarse-grained to fine-grained classes. First, we exploit the semantic information of classes to construct an MGKG, which captures the affiliation between fine-grained and coarse-grained classes. With the help of the MGKG, fine-grained knowledge can inherit coarse-grained knowledge, reducing transfer bias. We then propose a multi-scale feature fusion network that fully mines the rich information in the features to drive MGKT. Experiments show that the proposed model outperforms several state-of-the-art models in classifying long-tailed data; for example, it surpasses the next-best model by 4.46% on the SUN-LT dataset.
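
The abstract describes recasting the MGKG as coarse- and fine-grained losses that guide vertical knowledge transfer. As a rough illustration only (the paper's actual loss formulation is not reproduced here), the sketch below shows one way such a combined loss could be written in PyTorch, where a fine-to-coarse mapping encodes the class affiliations of the MGKG; the function name, the `alpha` weight, and the probability-aggregation scheme are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch (not the authors' released code): a hierarchical loss that
# combines a fine-grained classification loss with a coarse-grained loss derived
# from a fine-to-coarse class mapping, as one possible realization of
# coarse-to-fine vertical knowledge transfer.
import torch
import torch.nn.functional as F


def multi_granularity_loss(fine_logits, fine_targets, fine_to_coarse, alpha=0.5):
    """fine_logits: (batch, num_fine_classes) raw classifier scores.
    fine_targets: (batch,) fine-grained class indices.
    fine_to_coarse: (num_fine_classes,) LongTensor mapping each fine-grained
        class to its coarse-grained superclass, i.e. the MGKG affiliation.
    alpha: weight balancing the coarse- and fine-grained terms (assumed)."""
    num_coarse = int(fine_to_coarse.max().item()) + 1

    # Fine-grained loss: standard cross-entropy on the leaf classes.
    fine_loss = F.cross_entropy(fine_logits, fine_targets)

    # Coarse-grained probabilities: sum the probabilities of fine classes
    # that share the same superclass, following the fine-to-coarse affiliation.
    fine_probs = F.softmax(fine_logits, dim=1)
    coarse_probs = fine_probs.new_zeros(fine_probs.size(0), num_coarse)
    coarse_probs.index_add_(1, fine_to_coarse, fine_probs)

    # Coarse-grained loss: negative log-likelihood of the true superclass.
    coarse_targets = fine_to_coarse[fine_targets]
    coarse_loss = F.nll_loss(torch.log(coarse_probs + 1e-12), coarse_targets)

    return alpha * coarse_loss + (1.0 - alpha) * fine_loss
```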
