Abstract
Existing methods based on transfer learning leverage auxiliary information to help tail classes generalize and to improve their performance. However, they cannot fully exploit the relationships between the auxiliary information and the tail classes, and they may bring irrelevant knowledge to the tail classes. To address this problem, we propose a hierarchical CNN with knowledge complementation, which treats hierarchical relationships as auxiliary information and transfers only relevant knowledge to the tail classes. First, we integrate semantic and clustering relationships as hierarchical knowledge into the CNN to guide feature learning. Then, we design a complementary strategy to jointly exploit the two types of knowledge, where semantic knowledge acts as a prior dependence and clustering knowledge reduces the negative information caused by excessive semantic dependence (i.e., semantic gaps). In this way, the CNN exploits the two complementary hierarchical relationships and transfers useful knowledge to tail data, improving long-tailed classification accuracy. Experimental results on public benchmarks show that the proposed model outperforms existing methods. In particular, it improves accuracy by 3.46% over the second-best method on the long-tailed tieredImageNet dataset.
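For readers who want a concrete picture of the idea summarized above, the following is a minimal sketch, not the authors' implementation, of how semantic and clustering hierarchies might be attached to a CNN as auxiliary supervision. All module and parameter names (e.g., `sem_head`, `clu_head`, `alpha`, `beta`) are hypothetical, and the simple weighted sum of losses stands in for the paper's complementary strategy.

```python
# Minimal PyTorch sketch, assuming "semantic_labels" come from a superclass
# hierarchy and "cluster_labels" from offline feature clustering; the loss
# weights alpha and beta are hypothetical hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalKnowledgeCNN(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int,
                 num_classes: int, num_semantic_groups: int, num_clusters: int):
        super().__init__()
        self.backbone = backbone                                   # shared feature extractor
        self.cls_head = nn.Linear(feat_dim, num_classes)           # fine-grained class head
        self.sem_head = nn.Linear(feat_dim, num_semantic_groups)   # semantic-hierarchy head
        self.clu_head = nn.Linear(feat_dim, num_clusters)          # clustering-hierarchy head

    def forward(self, x):
        feat = self.backbone(x)
        return self.cls_head(feat), self.sem_head(feat), self.clu_head(feat)

def complementary_loss(outputs, class_labels, semantic_labels, cluster_labels,
                       alpha: float = 0.5, beta: float = 0.5):
    # The semantic branch injects prior hierarchical dependence, while the
    # clustering branch counteracts over-reliance on semantics (the "semantic
    # gap" mentioned in the abstract).
    cls_logits, sem_logits, clu_logits = outputs
    return (F.cross_entropy(cls_logits, class_labels)
            + alpha * F.cross_entropy(sem_logits, semantic_labels)
            + beta * F.cross_entropy(clu_logits, cluster_labels))
```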