Abstract

Classification is an essential task in machine learning, and developing a classifier that minimizes error on unseen data is one of its central problems. It is known that the center of mass of the version space is consistent with the Bayes-optimal decision surface, and that the analytic center is a good approximation of this center of mass. In this work, we therefore propose an evolutionary algorithm that exploits the convexity of the version space to evolve a population of perceptron classifiers toward a solution approximating its analytic center. Hyperspherical coordinates are used to guarantee feasibility when generating new individuals and to distribute exploration uniformly over the search space. Individuals are evaluated with a potential function that employs a logarithmic barrier penalty. Experiments were performed on real datasets, and the results indicate concrete possibilities for applying the proposed algorithm to practical problems.
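The key ingredients described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dataset, population size, mutation scale, and selection scheme are all assumptions. It shows how hyperspherical angles always map to a feasible unit-norm weight vector, and how a logarithmic-barrier potential penalizes classifiers close to the boundary of the version space.

```python
import numpy as np

def angles_to_unit_vector(angles):
    """Map n-1 hyperspherical angles to a unit vector in R^n.

    Every angle vector yields a valid (unit-norm) perceptron weight
    vector, so mutation in angle space never leaves the search space.
    """
    n = len(angles) + 1
    w = np.ones(n)
    for i, a in enumerate(angles):
        w[i] *= np.cos(a)       # current coordinate gets cos(a)
        w[i + 1:] *= np.sin(a)  # remaining coordinates accumulate sin(a)
    return w

def potential(w, X, y):
    """Logarithmic-barrier potential over the functional margins.

    Returns +inf when w misclassifies any point (i.e. lies outside the
    version space); otherwise grows as any margin approaches zero, so
    minimizing it pushes w toward the analytic center.
    """
    margins = y * (X @ w)
    if np.any(margins <= 0):
        return np.inf
    return -np.sum(np.log(margins))

def evolve(X, y, pop_size=30, generations=200, sigma=0.1, seed=None):
    """Simple (mu + lambda)-style loop over angle vectors (illustrative)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1] - 1  # number of hyperspherical angles
    pop = rng.uniform(0.0, np.pi, size=(pop_size, d))
    for _ in range(generations):
        children = pop + rng.normal(0.0, sigma, size=pop.shape)
        candidates = np.vstack([pop, children])
        scores = np.array(
            [potential(angles_to_unit_vector(a), X, y) for a in candidates]
        )
        pop = candidates[np.argsort(scores)[:pop_size]]  # truncation selection
    return angles_to_unit_vector(pop[0])

# Tiny linearly separable example (hypothetical data).
X = np.array([[1.0, 0.2], [0.8, 1.0], [-1.0, -0.3], [-0.5, -1.0]])
y = np.array([1, 1, -1, -1])
w = evolve(X, y, seed=0)
```

The returned `w` separates the sample with unit norm; in this sketch the barrier potential plays the role of the fitness function described in the abstract.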
