Abstract

For a classification problem, the nonlinearly separable case can be transformed into a linearly separable one by using the kernel trick. However, introducing a single kernel function into a classifier makes feature selection difficult and reduces model interpretability, which is very important for some practical applications. Moreover, multi-kernel learning used to improve predictive accuracy can lead to poor efficiency and interpretability, especially for large-scale, high-dimensional classification problems. In this paper, we propose a novel sparse feature kernel multi-criteria linear programming classifier (SFK-MCLPC). This two-stage classifier employs new row and column kernel matrices of different feature kernels to iteratively solve two associated linear programming problems, thereby identifying important features from the data during classification. On ten real-world data sets, the experimental results and comparisons with multi-criteria linear and quadratic programming classifiers (MCLPC, MCQPC), the support vector classifier (SVC), and the multiple kernel learning SVC (MKL-SVC) show that the proposed SFK-MCLPC can enhance the predictive accuracy for different classes, the efficiency of classification, the interpretability of the pattern analysis, and the generalization ability for predicting the classes of unseen instances by selecting the relevant and important features.
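To make the linear-programming flavor of such classifiers concrete, the following is a minimal, hypothetical sketch (not the authors' SFK-MCLPC formulation, and the function name, the `lam` parameter, and the use of scipy.optimize.linprog are illustrative assumptions). It shows how an L1-penalized linear classifier can be posed as a single linear program whose solution drives many feature weights to exactly zero, i.e. how an LP can perform feature selection as part of classification.

```python
import numpy as np
from scipy.optimize import linprog

def sparse_lp_classifier(X, y, lam=0.1):
    """Illustrative L1-penalized LP classifier (a sketch, not SFK-MCLPC).
    X: (n, d) feature matrix, y: (n,) labels in {-1, +1}.
    The weight vector w is split as u - v with u, v >= 0 so that the L1
    norm becomes linear; minimizing lam*||w||_1 + sum(slack) subject to
    margin constraints yields a sparse w."""
    n, d = X.shape
    # Variable order: u (d), v (d), b (1), slack xi (n)
    c = np.concatenate([lam * np.ones(2 * d), [0.0], np.ones(n)])
    yX = y[:, None] * X
    # Encodes y_i * ((u - v) @ x_i + b) >= 1 - xi_i as A_ub @ z <= b_ub
    A_ub = np.hstack([-yX, yX, -y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    u, v = res.x[:d], res.x[d:2 * d]
    w, b = u - v, res.x[2 * d]
    return w, b  # features with w[j] == 0 are effectively dropped
```

In such a sketch, prediction is sign(X @ w + b), and the features whose weights are driven to zero can be discarded before a second-stage problem is solved over the retained features only, loosely mirroring the two-stage, feature-selecting structure described in the abstract.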
