Abstract
Support vector machine (SVM) is a popular machine learning technique that has been widely applied in many real-world applications. Since SVM is sensitive to outliers and noise in the dataset, Fuzzy SVM (FSVM) has been proposed. Like SVM, it aims at finding an optimal hyperplane that separates two classes with the maximal margin; the only difference is that a fuzzy membership is assigned to each training point according to its importance, which makes FSVM less sensitive to outliers and noise to some extent. However, FSVM ignores an important piece of prior knowledge: the within-class structure. In this paper, we propose a new classification algorithm, FSVM with minimum within-class scatter (WCS-FSVM), which incorporates the minimum within-class scatter of Fisher Discriminant Analysis (FDA) into FSVM. The main idea is to find an optimal hyperplane such that the margin is maximized while the within-class scatter is kept as small as possible. In addition, we propose a new fuzzy membership function for WCS-FSVM. Experiments on six benchmark datasets and four artificial datasets show that the proposed WCS-FSVM algorithm not only improves classification accuracy and generalization ability but also handles classification problems with outliers or noise more effectively.
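To make the stated idea concrete, the following is a minimal sketch of one plausible primal formulation: a standard soft-margin SVM whose objective is augmented with a within-class scatter penalty w^T S_w w (from FDA) and whose slack penalties are weighted by per-sample fuzzy memberships. This is an illustrative assumption, not the paper's exact formulation; the function name wcs_fsvm_fit and the trade-off parameter lam are hypothetical, and the paper's proposed membership function is not reproduced here (uniform memberships are used in the toy example).

```python
import numpy as np
import cvxpy as cp

def wcs_fsvm_fit(X, y, s, C=1.0, lam=1.0):
    """Linear WCS-FSVM-style classifier (hypothetical formulation).

    X : (n, d) data matrix; y : labels in {-1, +1};
    s : fuzzy memberships in (0, 1]; C : slack penalty;
    lam : assumed trade-off weight on the scatter term.
    """
    n, d = X.shape
    # Stack class-centred samples D so that ||D @ w||^2 == w^T S_w w,
    # where S_w is the within-class scatter matrix from FDA.
    D = np.vstack([X[y == c] - X[y == c].mean(axis=0) for c in np.unique(y)])

    w = cp.Variable(d)
    b = cp.Variable()
    xi = cp.Variable(n, nonneg=True)          # slack variables

    objective = cp.Minimize(
        0.5 * cp.sum_squares(w)               # maximal-margin term (||w||^2 / 2)
        + 0.5 * lam * cp.sum_squares(D @ w)   # within-class scatter term (w^T S_w w / 2)
        + C * cp.sum(cp.multiply(s, xi))      # fuzzy-membership-weighted slacks
    )
    constraints = [cp.multiply(y, X @ w + b) >= 1 - xi]
    cp.Problem(objective, constraints).solve()
    return w.value, b.value

# Toy usage: two Gaussian classes with uniform memberships.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
s = np.ones(100)
w, b = wcs_fsvm_fit(X, y, s, C=1.0, lam=0.5)
print("decision rule: sign(w^T x + b), w =", w, "b =", b)
```

With lam = 0 this sketch reduces to an ordinary fuzzy SVM, and with s set to all ones it reduces to an SVM with a within-class scatter penalty, which is how the two ingredients combined in WCS-FSVM can be seen in isolation.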