Abstract

Multi‐label classification algorithms deal with classification problems where a single datapoint can be classified (or labelled) with more than one class (or label) at the same time. Early multi‐label approaches such as binary relevance consider each label independently and train a separate binary classifier for each label. State‐of‐the‐art algorithms like RAkEL, classifier chains, calibrated label ranking, IBLR‐ML+, and BPMLL also consider the associations between labels for improved performance. Like most machine learning algorithms, however, these approaches require careful hyper‐parameter tuning, a computationally expensive optimisation problem. There is a scarcity of multi‐label classification algorithms that require minimal hyper‐parameter tuning. This paper addresses this gap in the literature by proposing CascadeML, a multi‐label classification method based on the existing cascaded neural network architecture, which also takes label associations into consideration. CascadeML grows a neural network architecture incrementally (deep as well as wide) in a two‐phase process as it learns network weights using an adaptive first‐order gradient descent algorithm. This removes the need to preselect the number of hidden layers, the number of nodes per layer, the activation functions, and the learning rate. The performance of the CascadeML algorithm was evaluated on 13 multi‐label datasets and compared with nine existing multi‐label algorithms. The results show that CascadeML achieved the best average rank over the datasets, performed better than BPMLL (one of the earliest well‐known multi‐label‐specific neural network algorithms), and performed comparably to the state‐of‐the‐art classifier chains and RAkEL algorithms.
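
To make the binary relevance baseline mentioned above concrete, the following is a minimal sketch that trains one independent binary classifier per label and stacks the per-label predictions. The use of scikit-learn's LogisticRegression as the base learner and the function names are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of binary relevance: one independent binary classifier per label.
# Assumes a dense feature matrix X of shape (n_samples, n_features) and a binary
# label matrix Y of shape (n_samples, n_labels).
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_binary_relevance(X, Y):
    """Train a separate binary classifier for each label column of Y."""
    models = []
    for j in range(Y.shape[1]):
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X, Y[:, j])  # each label is treated in isolation
        models.append(clf)
    return models

def predict_binary_relevance(models, X):
    """Stack the per-label predictions back into a label matrix."""
    return np.column_stack([m.predict(X) for m in models])
```

Because each classifier is trained in isolation, this baseline ignores associations between labels; CascadeML, by contrast, learns all labels within a single incrementally grown network so that such associations can be exploited.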


