Abstract

Instance-based classifiers need to store a large number of samples as a training set. In this paper, we propose a large symmetric margin instance selection algorithm, namely LAMIS. LAMIS removes non-border (interior) instances and keeps border ones. This paper formulates the instance selection process as a constrained binary optimization problem and solves it by employing a filled function algorithm. Instance-based learning algorithms are often confronted with the problem of deciding which instances must be stored for use during an actual test. Storing too many instances can result in large memory requirements and slow execution. In LAMIS, the core of the instance selection process is to preserve the hyperplane that separates two-class data, providing large-margin separation. LAMIS selects the most representative instances, satisfying both objectives: high accuracy and a high reduction rate. Performance has been evaluated on real-world data sets from the UCI repository using ten-fold cross-validation. The experimental results have been compared with state-of-the-art methods, and the overall results show the superiority of the proposed method in terms of classification accuracy and reduction percentage.
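For intuition only, the sketch below illustrates the general idea of retaining border instances and discarding interior ones for a nearest-neighbor classifier. It is not the LAMIS formulation itself (which solves a constrained binary optimization with a filled function method); the function name, parameters, and toy data are hypothetical.

```python
import numpy as np

def select_border_instances(X, y, k=5):
    """Keep instances whose k-nearest neighborhood contains opposite-class
    points (border instances); discard interior ones.

    Illustrative heuristic only, not the LAMIS optimization itself.
    """
    n = X.shape[0]
    keep = np.zeros(n, dtype=bool)
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)      # exclude each point from its own neighborhood
    for i in range(n):
        nn = np.argsort(d2[i])[:k]    # indices of the k nearest neighbors
        # a point is a border instance if any neighbor carries a different label
        keep[i] = np.any(y[nn] != y[i])
    return X[keep], y[keep], keep

# toy usage on two Gaussian blobs (hypothetical data)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
X_sel, y_sel, mask = select_border_instances(X, y, k=5)
print(f"kept {mask.sum()} of {len(y)} instances")
```

Interior points, whose neighborhoods are label-homogeneous, are dropped, which captures the reduction objective described in the abstract; the paper's method additionally enforces the large-margin separation criterion through its optimization formulation.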
