In this paper, a new support vector machine, ESVM, with greater emphasis on the constraints is presented. The constraints are fuzzy inequalities. With this scheme, two problems are addressed: training samples with some degree of uncertainty, and samples with tolerance. In addition, the fuzzy SVM (FSVM) model is modified with emphasized constraints; the resulting model is called fuzzy ESVM (FESVM) in this paper. With this scheme we are able to consider an importance degree for each sample in both the cost function and the constraints simultaneously. The necessary experiments are performed, and the results show the superiority of the proposed methods. ESVM and fuzzy ESVM are strongly recommended to researchers who work on data sets containing noisy samples or samples with a low degree of certainty.

Support vector machines (SVMs) are among the most powerful methods, delivering state-of-the-art performance in real-world pattern recognition and data mining applications. In the SVM solution, a pattern recognition problem is converted into a constrained quadratic programming problem. The support vector machine was originally introduced by Vapnik (1) within the area of statistical learning theory. It is based on the structural risk minimization (SRM) principle and finds a classifier with minimized Vapnik-Chervonenkis (VC) dimension. In pattern recognition, SVMs have been developed for data classification, feature reduction, and function estimation (2,3). The SVM classifier is very sensitive to outliers and noisy samples, since the penalty term of SVM treats every data point equally in the training process. Increasing the values of the slack variables helps reduce the effect of noisy support vectors; without these slack variables, in the presence of noisy data, SVM may not be able to determine a hyperplane between the two classes. The generalized formulation of the SVM model, which can tolerate noisy data close to the separating hyperplane, is as follows:
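The formulation itself is not reproduced in this excerpt. As a sketch, the standard soft-margin SVM program that this passage appears to refer to, written with training pairs $(\mathbf{x}_i, y_i)$, slack variables $\xi_i$, and penalty parameter $C$ (all notation assumed here rather than taken from the paper), is

\[
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;\; \frac{1}{2}\lVert \mathbf{w} \rVert^{2} + C \sum_{i=1}^{n} \xi_{i}
\qquad \text{s.t.} \quad y_{i}\bigl(\mathbf{w}^{\top}\mathbf{x}_{i} + b\bigr) \ge 1 - \xi_{i}, \quad \xi_{i} \ge 0, \quad i = 1, \dots, n,
\]

where each $\xi_i$ allows sample $i$ to violate the margin and $C$ trades off margin width against the total violation, which is what lets the classifier tolerate noisy points near the separating hyperplane.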