Abstract

When building a well-performing one-class classifier, the low-variance directions of the training data can provide important information. The Covariance-guided One-Class Support Vector Machine (COSVM) exploits these low-variance directions, which results in better accuracy. However, COSVM does not account for data dispersion within the target class; in particular, it makes no use of subclass information in the target class. To address this, we propose the Scatter Covariance-guided One-Class Support Vector Machine (SC-OSVM), a novel variant of the COSVM classifier. In the kernel space, our approach uses subclass information to jointly decrease intra-class dispersion. The resulting algorithm is formulated as a convex optimization problem that can be solved efficiently with standard numerical methods. A comparison on artificial and real-world data sets shows that SC-OSVM provides more efficient and robust solutions than the standard COSVM and other state-of-the-art one-class classifiers.
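
For intuition only, the sketch below illustrates the covariance-guided idea that COSVM and SC-OSVM build on, in the simplest linear setting: a one-class SVM whose regularizer blends the usual margin term with a covariance term w^T C w, so that minimizing it steers the hyperplane normal toward the low-variance directions of the target class. This is a toy illustration under stated assumptions (synthetic data, CVXPY, a blending weight eta), not the paper's kernel-space SC-OSVM formulation with subclass scatter.

    # Minimal, hypothetical sketch of the covariance-guided idea in the
    # linear (input-space) case, solved with CVXPY. It is NOT the authors'
    # SC-OSVM: the toy data, the blending weight eta and the nu parameter
    # are illustrative assumptions, and the subclass-scatter term that
    # SC-OSVM adds in the kernel space is omitted.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    # Elongated target class: large variance along x, small variance along y.
    X = rng.normal(size=(80, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
    n, d = X.shape

    nu, eta = 0.1, 0.5                 # outlier-fraction bound / covariance weight
    C = np.cov(X, rowvar=False)        # empirical covariance of the target class
    evals, evecs = np.linalg.eigh(C)
    C_sqrt = evecs @ np.diag(np.sqrt(np.clip(evals, 0.0, None))) @ evecs.T

    w = cp.Variable(d)
    rho = cp.Variable()
    xi = cp.Variable(n, nonneg=True)

    # Blending the usual margin term ||w||^2 with w^T C w (= ||C^{1/2} w||^2)
    # penalizes normals aligned with high-variance directions, i.e. it steers
    # the separating hyperplane toward the low-variance directions of the data.
    objective = cp.Minimize(0.5 * ((1 - eta) * cp.sum_squares(w)
                                   + eta * cp.sum_squares(C_sqrt @ w))
                            - rho + (1.0 / (nu * n)) * cp.sum(xi))
    constraints = [X @ w >= rho - xi]  # standard one-class SVM constraints
    cp.Problem(objective, constraints).solve()

    scores = X @ w.value - rho.value   # >= 0 (up to slack) on target points
    print("fraction flagged as outliers:", float(np.mean(scores < 0)))

In this toy setting the covariance term plays the role that the low-variance guidance plays in COSVM; the paper's contribution, as stated in the abstract, is to additionally exploit subclass scatter of the target class while keeping the problem convex.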
