Abstract

In conventional support vector machines (SVMs), an n-class problem is converted into n two-class problems. For the i-th two-class problem we determine the optimal decision function that separates class i from the remaining classes. In classification, a sample is classified into class i only when the value of the i-th decision function is positive. In this architecture, the sample is unclassifiable if the values of two or more decision functions are positive or all the values are negative. In this paper, to overcome this problem, we propose fuzzy support vector machines (FSVMs). Using the decision functions obtained by training the SVM, for each class, we define a truncated polyhedral pyramidal membership function. Since, for the data in the classifiable regions, the classification results are the same for the two methods, the generalization ability of the FSVM is the same as or better than that of the SVM. We evaluate our method for three benchmark data sets and demonstrate the superiority of the FSVM over the SVM.
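The idea above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it assumes the common FSVM formulation in which, from the one-vs-rest decision values d_i(x), each class i gets a membership m_i(x) = min_j m_ij(x), with m_ii(x) truncated at 1 on the positive side and m_ij(x) (j ≠ i) truncated at 1 on the negative side, and a sample is assigned to the class of maximum membership. The function names and the toy decision values are illustrative only.

```python
import numpy as np

def fuzzy_membership(decisions):
    """Membership degree of each class from one-vs-rest decision values.

    Sketch of a truncated polyhedral pyramidal membership function
    (assumed formulation, not taken verbatim from the paper):
      m_ii(x) = 1 if d_i(x) >= 1, else d_i(x)
      m_ij(x) = 1 if d_j(x) <= -1, else -d_j(x)   (j != i)
      m_i(x)  = min_j m_ij(x)
    """
    d = np.asarray(decisions, dtype=float)
    n = d.size
    m = np.empty(n)
    for i in range(n):
        # Truncate at 1 inside class i's own positive margin.
        m_ii = 1.0 if d[i] >= 1.0 else d[i]
        # Truncate at 1 when safely on the negative side of every other class.
        m_rest = [1.0 if d[j] <= -1.0 else -d[j] for j in range(n) if j != i]
        m[i] = min([m_ii] + m_rest)
    return m

def fsvm_classify(decisions):
    """Assign the class with the largest membership degree."""
    return int(np.argmax(fuzzy_membership(decisions)))
```

For a sample in a classifiable region (exactly one positive decision value, e.g. `[2.0, -1.5, -3.0]`), the FSVM agrees with the plain SVM and returns class 0. For a sample the SVM leaves unclassifiable, e.g. `[0.5, 0.3, -2.0]` (two positive values) or `[-0.5, -0.2, -0.9]` (all negative), the membership comparison still yields a unique class, which is how the unclassifiable regions are resolved.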


