Abstract
The linear 2-norm Support Vector Machine (L2-SVM) builds a hyperplane that maximizes the 2-norm soft margin. Random projection is an oblivious feature-extraction and dimension-reduction method. Both techniques are widely applied in compressed sensing, texture classification, face recognition, and related areas. This paper shows that random projection can be applied to any input matrix of the L2-SVM. Furthermore, we prove that, with high probability, the geometric margin and the minimum enclosing ball in the projected space are almost unchanged compared with those in the original space. The result is demonstrated by experiments on synthetic and real data. Computational experiments also show that the proposed random projection for the L2-SVM outperforms random projection for the L1-SVM. In addition, the proposed model solves large-scale classification problems more effectively and efficiently. Finally, the proposed method is compared with Principal Component Analysis and is applied to Corel image classification problems.
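The margin-preservation claim rests on the Johnson-Lindenstrauss property of random projections: a suitably scaled Gaussian matrix approximately preserves pairwise distances, and hence margins, with high probability. A minimal sketch of this effect (the dimensions, sample count, and distortion threshold below are illustrative assumptions, not values from the paper):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

n, d, k = 20, 1000, 300  # samples, original dimension, projected dimension (illustrative)
X = rng.standard_normal((n, d))

# Gaussian random projection: entries N(0, 1/k), so squared norms are preserved in expectation
R = rng.standard_normal((d, k)) / np.sqrt(k)
X_proj = X @ R

# Measure the worst relative distortion of pairwise distances after projection
distortions = []
for i, j in combinations(range(n), 2):
    orig = np.linalg.norm(X[i] - X[j])
    proj = np.linalg.norm(X_proj[i] - X_proj[j])
    distortions.append(abs(proj / orig - 1.0))
max_distortion = max(distortions)
```

Because distances between the training points (and their distances to the separating hyperplane) are nearly preserved, an L2-SVM trained in the k-dimensional projected space attains almost the same geometric margin as in the original d-dimensional space, at a fraction of the cost.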