Abstract

Like binary and multi-class classifiers, one-class classifiers face the 'curse of dimensionality' when applied to high-dimensional samples. As an efficient dimensionality reduction method, sparse coding learns a set of over-complete bases to represent the given samples and can effectively mitigate this problem. However, traditional sparse coding is only suited to Gaussian noise: when the noise in the given samples follows a non-Gaussian distribution, conventional sparse coding cannot obtain accurate coefficient vectors. To make sparse coding better suited to non-Gaussian noise and to enhance the sparseness of the obtained coefficient vectors, correntropy is used to replace the reconstruction error term and a logarithmic penalty function is introduced as the regularization term. The resulting sparse coefficient vectors are then used as input vectors for a one-class support vector machine (OCSVM). Experimental results on twenty UCI benchmark data sets and one handwritten digit data set demonstrate that the proposed method achieves better anti-noise and generalization abilities than related approaches.
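The robustness argument behind replacing the squared reconstruction error with correntropy can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's implementation; the function names, the kernel width `sigma`, and the synthetic data are all hypothetical) contrasts the mean squared error with the correntropy-induced loss, a bounded loss built from a Gaussian kernel of the residuals, on data containing one gross non-Gaussian outlier:

```python
import numpy as np

def squared_error(residual):
    """Mean squared reconstruction error (the conventional term)."""
    return np.mean(residual ** 2)

def correntropy_loss(residual, sigma=1.0):
    """Correntropy-induced loss: 1 minus the mean Gaussian kernel of the
    residuals. It is bounded in [0, 1), so a single large residual can
    shift the objective by at most 1/n."""
    return 1.0 - np.mean(np.exp(-residual ** 2 / (2.0 * sigma ** 2)))

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 0.1, size=100)   # small Gaussian residuals
noisy = clean.copy()
noisy[0] = 50.0                          # one gross non-Gaussian outlier

# The squared error is dominated by the outlier,
# while the correntropy loss barely changes.
print(squared_error(clean), squared_error(noisy))
print(correntropy_loss(clean), correntropy_loss(noisy))
```

Because each residual's kernel contribution saturates, the outlier adds at most 1/100 to the correntropy loss here, whereas it inflates the squared error by roughly 2500/100; this boundedness is what makes the correntropy-based reconstruction term insensitive to heavy-tailed, non-Gaussian noise.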
