Abstract

Driven by recent advances in machine learning, multi-label classification has emerged as the challenging task of assigning more than one label to each instance in a dataset. Feature selection is one of the predominant feature engineering methodologies and is widely used as a vital step in predictive model construction to improve multi-label classification performance. Many metaheuristic algorithms have been tailored to choose an optimal feature subset, but such algorithms converge slowly when fine-tuning candidate solutions. The objective of this paper is to propose a hybrid mechanism in which the feature subset obtained by a Binary Differential Evolution (BDE) algorithm is further refined with a local search methodology to minimize the classification error. The key motivation behind the proposed model is to address the weak exploitation of metaheuristic feature selection algorithms with the help of a classical feature selection method, Sequential Backward Selection (SBS), used as a local search strategy. The classical method eliminates additional redundant and irrelevant features from the subset obtained by BDE, thereby decreasing the classification error. Empirical results on eight multi-label datasets show that the proposed hybrid approach, a fusion of evolutionary and classical feature selection, reduces the classification error of the feature subset obtained by BDE.
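
As a rough illustration of the hybrid pipeline described above, the Python sketch below evolves binary feature masks with a simple Binary Differential Evolution loop and then refines the best mask with Sequential Backward Selection as a local search. The k-NN classifier, Hamming-loss fitness, sigmoid binarization scheme, synthetic data, and all parameter values are illustrative assumptions, not the paper's exact configuration.

import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import hamming_loss

rng = np.random.default_rng(0)

# Illustrative multi-label data; a real study would load benchmark datasets instead.
X, Y = make_multilabel_classification(n_samples=300, n_features=40,
                                      n_classes=5, random_state=0)
X_tr, X_va, Y_tr, Y_va = train_test_split(X, Y, test_size=0.3, random_state=0)

def error(mask):
    """Classification error (Hamming loss) of a k-NN model on the selected features."""
    if mask.sum() == 0:                                # empty subsets are invalid
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    clf.fit(X_tr[:, mask], Y_tr)
    return hamming_loss(Y_va, clf.predict(X_va[:, mask]))

def bde(n_features, pop_size=20, gens=30, F=0.8, CR=0.5):
    """Binary DE: real-valued DE mutation mapped to bits via a sigmoid (one common scheme)."""
    pop = rng.random((pop_size, n_features)) < 0.5
    fit = np.array([error(ind) for ind in pop])
    for _ in range(gens):
        for i in range(pop_size):
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            v = pop[r1] + F * (pop[r2].astype(float) - pop[r3].astype(float))
            prob = 1.0 / (1.0 + np.exp(-v))            # sigmoid binarization
            trial = np.where(rng.random(n_features) < CR,
                             rng.random(n_features) < prob, pop[i])
            f_trial = error(trial)
            if f_trial <= fit[i]:                       # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best].copy(), fit[best]

def sbs(mask, err):
    """Local search: drop selected features one at a time while the error does not increase."""
    improved = True
    while improved and mask.sum() > 1:
        improved = False
        for j in np.flatnonzero(mask):
            cand = mask.copy()
            cand[j] = False
            e = error(cand)
            if e <= err:                                # remove a redundant/irrelevant feature
                mask, err, improved = cand, e, True
                break
    return mask, err

best_mask, best_err = bde(X.shape[1])
refined_mask, refined_err = sbs(best_mask, best_err)
print(f"BDE subset: {best_mask.sum()} features, error {best_err:.3f}")
print(f"After SBS : {refined_mask.sum()} features, error {refined_err:.3f}")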
