Abstract

Classification is an essential task in machine learning, and finding a maximum-margin classifier is one of its central problems. In this work, an evolutionary algorithm that relies on the convexity properties of the version space is constructed to evolve a population of perceptron classifiers toward a solution that approximates the maximum margin. Unlike methods that explore the problem's dual formulation, which usually require solving a linearly constrained quadratic programming problem, the proposed method requires only the evaluation of margin values. Hyperspherical coordinates are used both to guarantee feasibility when generating new individuals and to distribute the population uniformly throughout the search space. To control the number of generations, we developed a stopping criterion based on a lower-bound function that asymptotically approximates the margin curves, providing a stopping margin that satisfies a $\beta$-approximation of the optimal margin. Experiments were performed on artificial and real datasets, and the results obtained indicate the potential of the proposed algorithm for solving practical problems.
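The abstract notes that the method needs only margin evaluations, with unit-norm perceptrons sampled via hyperspherical coordinates, rather than solving a quadratic program. A minimal sketch of those two ingredients (not the paper's evolutionary algorithm itself) is below; the dataset, sample count, and the use of normalized Gaussians for uniform directions are illustrative assumptions.

```python
import numpy as np

def margin(w, X, y):
    # Geometric margin of a unit-norm homogeneous classifier w:
    # the smallest signed distance y_i * <w, x_i> over all points.
    return np.min(y * (X @ w))

def random_unit_vectors(n, d, rng):
    # Uniform directions on the (d-1)-sphere: normalized Gaussian
    # samples (an equivalent alternative to hyperspherical angles).
    v = rng.standard_normal((n, d))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def sample_max_margin(X, y, n_samples=5000, seed=0):
    # Baseline illustration: draw many feasible unit-norm perceptrons
    # and keep the one with the largest margin -- only margin
    # evaluations are needed, no quadratic programming.
    rng = np.random.default_rng(seed)
    W = random_unit_vectors(n_samples, X.shape[1], rng)
    margins = (W @ X.T * y).min(axis=1)  # margin of every candidate
    best = margins.argmax()
    return W[best], margins[best]

# Toy linearly separable data (hypothetical): labels follow sign(x0).
X = np.array([[2.0, 0.5], [1.5, -1.0], [-2.0, 0.3], [-1.2, -0.8]])
y = np.array([1, 1, -1, -1])
w, m = sample_max_margin(X, y)  # m > 0 means all points are separated
```

The paper's algorithm replaces the blind sampling above with an evolutionary search over the (convex) version space, but the fitness evaluation it requires is exactly the `margin` computation shown here.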

