Abstract

This paper explores how a class of probabilistic models, namely Gaussian kernel density estimation (GKDE), can be used to interpret several classical kernel methods: the well-known support vector machine (SVM), support vector regression (SVR), the one-class kernel classifier, i.e., support vector data description (SVDD) or, equivalently, the minimal enclosing ball (MEB), and fuzzy systems (FS). For the SVM, we reveal that the classical SVM with a Gaussian density kernel attempts to find a noisy-GKDE-based Bayesian classifier with equal prior probabilities for each class. For the SVR, the classification-based ε-SVR attempts to obtain two noisy GKDEs, one for each class in the constructed binary classification dataset, and the decision boundary corresponds exactly to the mapping function of the original regression problem. For the MEB, or SVDD, we reveal its equivalence to GKDE under the integrated-squared-error (ISE) criterion, and by using this equivalence we propose an MEB-based classifier with a privacy-preserving function for classification tasks in which the datasets contain privacy-preserving clouds. For the FS, we show that the GKDE for a regression dataset is equivalent to the construction of a zero-order Takagi–Sugeno–Kang (TSK) fuzzy system on the same dataset. Our extensive experiments confirm these conclusions and demonstrate the effectiveness of the proposed machine learning and modeling methods.
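To make the central object concrete, the following is a minimal sketch of a one-dimensional Gaussian KDE together with the equal-prior, Bayes-style two-class rule described above (pick the class whose density estimate is larger at the query point). The function names, bandwidth choice, and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kde(x, data, bandwidth):
    """Gaussian kernel density estimate at point(s) x from 1-D samples."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    data = np.asarray(data, dtype=float)
    # One Gaussian bump per sample; averaging and dividing by the
    # bandwidth normalises the mixture to a proper density.
    diffs = (x[:, None] - data[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs**2) / np.sqrt(2.0 * np.pi)
    return kernels.mean(axis=1) / bandwidth

def classify(x, class0_samples, class1_samples, bandwidth):
    """Equal-prior Bayes-style rule: assign x to the class with the
    larger GKDE value at x (the reading of the Gaussian-kernel SVM above)."""
    d0 = gaussian_kde(x, class0_samples, bandwidth)
    d1 = gaussian_kde(x, class1_samples, bandwidth)
    return int(d1 >= d0)
```

A query point near the samples of one class is assigned to that class; with equal priors this is exactly the maximum-likelihood decision between the two density estimates.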
