Abstract

Long-tailed class distributions make class-imbalanced problems very challenging for deep convolutional neural networks (CNNs). Existing solutions based on re-balancing methods perform well, but they rely on single-task CNNs that train each fine-grained class independently. However, classification tasks are often multiplex, involving a coarse-to-fine hierarchical relation among classes. In this paper, we propose a multi-task CNN based on coarse-to-fine knowledge transfer, which exploits the coarse-to-fine structure to improve long-tailed learning. First, we construct a tail hierarchical structure in a coarse-to-fine manner so that tail classes receive greater attention than head classes. Second, a multi-task CNN is adopted to train the coarse- and fine-grained tasks simultaneously, extracting a more generalized knowledge representation than a single-task CNN. Third, we design a coarse-to-fine knowledge transfer strategy that adaptively adjusts the task weights to improve fine-grained performance. Extensive experiments on benchmark datasets show that our model achieves larger gains than re-balancing methods; in particular, it is 3.25% more accurate than the second-best method on the long-tailed tieredImageNet dataset.
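The abstract does not specify how the task weights are adapted during training. As a minimal sketch only, the multi-task objective described above could combine a coarse-grained and a fine-grained loss under a schedule that gradually shifts weight toward the fine-grained task; the linear schedule and function names below are hypothetical illustrations, not the paper's actual strategy.

```python
def coarse_to_fine_weights(epoch: int, total_epochs: int) -> tuple[float, float]:
    """Hypothetical linear schedule: start with the coarse-grained task
    dominant, end with the fine-grained task dominant."""
    alpha = epoch / total_epochs
    return 1.0 - alpha, alpha  # (coarse weight, fine weight)


def multi_task_loss(coarse_loss: float, fine_loss: float,
                    epoch: int, total_epochs: int) -> float:
    """Weighted sum of the two task losses under the schedule above."""
    w_coarse, w_fine = coarse_to_fine_weights(epoch, total_epochs)
    return w_coarse * coarse_loss + w_fine * fine_loss


# Early in training the coarse task dominates; late in training the
# fine-grained task dominates.
print(multi_task_loss(2.0, 4.0, epoch=0, total_epochs=10))   # coarse only
print(multi_task_loss(2.0, 4.0, epoch=10, total_epochs=10))  # fine only
```

In a real training loop the two losses would come from separate classification heads on a shared backbone, which is what allows the coarse-grained knowledge to transfer to the fine-grained task.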
