Abstract

Multi-label learning has received much attention due to its applicability to machine learning problems. In recent years, quite a few approaches based on either the extreme learning machine (ELM) or the radial basis function (RBF) neural network have been proposed to improve the efficiency of multi-label classification. Most existing multi-label learning algorithms exploit only information from the feature space. In this paper, our main goal is to regularize the objective function of multi-label learning methods via Locally Linear Embedding (LLE). To this end, two neural network architectures, namely the Multi-Label RBF (ML-RBF) and the Multi-Label Multi-Layer ELM (ML-ELM), are utilized. Two training methods are then established for these network structures: a regularized multi-label learning method via feature manifold learning (RMLFM) and a regularized multi-label learning method via dual-manifold learning (RMLDM). RMLDM simultaneously exploits the geometric structure of both the feature space and the data space. Furthermore, eight different configurations of applying the training algorithms (i.e., RMLFM and RMLDM) to the model architectures (i.e., ML-RBF and ML-ELM) are considered for comparison. Experimental studies on several multi-label datasets demonstrate the validity and effectiveness of these eight classifiers. The experiments further indicate that the neural classifiers trained with dual-manifold learning improve classification performance considerably over several state-of-the-art multi-label techniques.
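To make the LLE-based regularization idea concrete, the sketch below shows the standard two steps it builds on: computing LLE reconstruction weights W from nearest neighbours, and forming the manifold penalty tr(Fᵀ(I−W)ᵀ(I−W)F) that can be added to a learning objective so that network outputs F preserve local linear structure. This is a minimal, generic illustration (function names, the `k` and `reg` parameters, and the plain NumPy formulation are our assumptions), not the paper's RMLFM/RMLDM implementation.

```python
import numpy as np

def lle_weights(X, k=3, reg=1e-3):
    """LLE step 1 (generic sketch, not the paper's code): reconstruct each
    row of X from its k nearest neighbours with weights summing to 1."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                         # exclude the point itself
        nbrs = np.argsort(d)[:k]
        Z = X[nbrs] - X[i]                    # neighbours centred at X[i]
        G = Z @ Z.T                           # local Gram matrix
        G += reg * np.trace(G) * np.eye(k)    # ridge term for stability
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs] = w / w.sum()              # normalise: weights sum to 1
    return W

def manifold_penalty(F, W):
    """tr(F^T M F) with M = (I - W)^T (I - W): small when the outputs F
    respect the local linear structure encoded by the LLE weights W."""
    n = W.shape[0]
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    return float(np.trace(F.T @ M @ F))
```

In a regularized objective of the kind the abstract describes, `manifold_penalty` would be weighted and added to the network's training loss; the dual-manifold variant would apply an analogous term in both the feature and data spaces.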
