Abstract

In the context of multi-label text classification (MLTC), Binary Relevance (BR) stands out as one of the most intuitive and frequently employed approaches. It tackles the MLTC task by decomposing it into multiple independent binary classification problems, one per label. However, BR has faced conceptual criticism for its omission of label dependency information. To address this limitation, numerous studies have focused on incorporating label dependencies and richer document features, yielding substantial improvements in the performance of MLTC models. While the question of whether models incorporating label dependency information consistently outperform BR models remains open, the prevailing view assumes their superiority. In this paper, we present evidence that challenges this widely held belief. Our numerical results across various text datasets demonstrate that an optimized binary relevance convolutional neural network (BR-CNN) can outperform advanced multi-label learning models explicitly designed to leverage label dependency information, as well as advanced BR models. Our results underscore the competitiveness of a BR-CNN approach for MLTC and emphasize the versatility of the BR model family as a customizable option. More fundamentally, our findings contribute to the ongoing discourse surrounding label dependency and provide valuable insights into the efficacy of the binary relevance approach.
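To make the decomposition concrete, the following is a minimal sketch of Binary Relevance with a per-label text CNN. The model shape, hyperparameters, and synthetic data are illustrative assumptions, not the paper's exact BR-CNN configuration: each label gets its own independently trained binary classifier, with no label-dependency modeling.

```python
# Minimal Binary Relevance sketch: one independent text CNN per label.
# All sizes and hyperparameters below are illustrative, not the paper's.
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    """One binary classifier: embedding -> 1D conv -> global max pool -> logit."""
    def __init__(self, vocab_size=5000, embed_dim=50, num_filters=32, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size)
        self.fc = nn.Linear(num_filters, 1)

    def forward(self, tokens):                          # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)          # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x)).max(dim=2).values  # global max pooling
        return self.fc(x).squeeze(1)                    # (batch,) raw logits

def train_binary_relevance(tokens, labels, epochs=3):
    """Fit one CNN per label column in isolation -- the BR decomposition."""
    models = []
    for j in range(labels.shape[1]):
        model, loss_fn = TextCNN(), nn.BCEWithLogitsLoss()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(tokens), labels[:, j])
            loss.backward()
            opt.step()
        models.append(model)
    return models

# Synthetic usage: 64 documents of 20 tokens each, 5 labels.
tokens = torch.randint(0, 5000, (64, 20))
labels = torch.randint(0, 2, (64, 5)).float()
models = train_binary_relevance(tokens, labels)
preds = torch.stack([torch.sigmoid(m(tokens)) > 0.5 for m in models], dim=1)
```

Because each classifier is fit in isolation, the sketch makes BR's defining independence assumption explicit; the paper's claim is that careful optimization within exactly this family can rival models built to exploit label dependencies.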
