Abstract

Particle Swarm Optimization is an optimization algorithm that mimics the behaviour of a flock of birds: a population of particles explores the search space, guided by a fitness function, in order to find the best possible solution. We apply the Sticky Binary Particle Swarm Optimization algorithm to perform feature selection for domain adaptation, a specific type of transfer learning in which the source and the target domain share a common feature space and a common task but have different distributions. When applying Particle Swarm Optimization, classification error is usually employed in the fitness function to evaluate the goodness of subsets of features. In this paper, we compare this approach with using complexity metrics instead, under the assumption that reducing the complexity of the problem leads to results that are independent of the classifier used for testing while being less computationally demanding. We therefore carried out experiments to compare the performance of both approaches in terms of classification accuracy, speed and number of features selected. We found that our proposal, although it incurs a slight degradation of classification performance in some cases, is indeed faster and selects fewer features, making it a feasible trade-off.
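Since the abstract only sketches the idea, the following is a minimal, illustrative Python sketch of the two kinds of fitness functions being compared: a wrapper fitness based on classification error and a filter fitness based on a classifier-independent data-complexity measure. The function names, the k-NN classifier, and the use of Fisher's discriminant ratio as the complexity measure are assumptions made for illustration; the paper's Sticky Binary PSO procedure and its exact complexity metrics are not detailed in the abstract.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def error_fitness(mask, X_src, y_src):
    """Wrapper-style fitness: cross-validated classification error of a
    classifier trained on the selected source-domain features.
    (k-NN is an arbitrary choice for illustration.)"""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():
        return 1.0  # an empty feature subset is the worst possible particle
    acc = cross_val_score(KNeighborsClassifier(), X_src[:, mask], y_src, cv=3).mean()
    return 1.0 - acc


def complexity_fitness(mask, X_src, y_src):
    """Filter-style fitness: a classifier-independent complexity measure.
    Here, the inverse of the maximum per-feature Fisher discriminant ratio
    (an assumption; the paper's exact complexity metrics may differ)."""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():
        return np.inf
    Xs = X_src[:, mask]
    classes = np.unique(y_src)
    ratios = []
    for j in range(Xs.shape[1]):
        col = Xs[:, j]
        means = np.array([col[y_src == c].mean() for c in classes])
        variances = np.array([col[y_src == c].var() for c in classes])
        # between-class spread over within-class spread for this feature
        ratios.append(means.var() / (variances.mean() + 1e-12))
    # an easier (less complex) problem has a large Fisher ratio, so minimise its inverse
    return 1.0 / (max(ratios) + 1e-12)


# A particle in binary PSO is simply a 0/1 mask over the features; either
# fitness function above can be plugged into the swarm's evaluation step.
rng = np.random.default_rng(0)
X_src = rng.normal(size=(100, 10))
y_src = (X_src[:, 0] + 0.1 * rng.normal(size=100) > 0).astype(int)
particle = rng.integers(0, 2, size=10)
print(error_fitness(particle, X_src, y_src), complexity_fitness(particle, X_src, y_src))
```

The complexity-based fitness avoids training and validating a classifier at every particle evaluation, which is where the speed advantage and classifier independence discussed in the abstract would come from.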
