Abstract

Over the last decade, SVM-based one-class classifiers have been widely explored for anomaly detection using single-kernel learning. More recently, they have been extended to multiple-kernel learning with a fixed combination of weights, where the same weight is assigned to each kernel over the entire input space. In this paper, an SVM-based one-class classifier, the Support Vector Data Descriptor (SVDD), is adapted for localized multiple-kernel learning; the resulting method is referred to as LMSVDD. Here, locality in the input space is the deciding factor for assigning a kernel different weights in different regions of the input space. This localization is achieved through a gating function trained in tandem with the SVDD-based one-class classifier. Localization also yields a sparser solution than existing multiple-kernel methods, as it requires fewer support vectors in many cases. Performance is evaluated through extensive experiments on 23 benchmark datasets from various disciplines, comparing LMSVDD with 6 state-of-the-art kernel-based methods. LMSVDD outperforms the existing single- and multiple-kernel-based methods, and the results are statistically verified using a Friedman test.
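The abstract describes region-dependent kernel weighting via a gating function. The paper's exact formulation is not given here, so the following is only a minimal sketch of the standard localized multiple-kernel construction, where a softmax gating model η_m(x) weights each kernel per sample: K(x_i, x_j) = Σ_m η_m(x_i) k_m(x_i, x_j) η_m(x_j). The gating parameters `V` and the RBF bandwidths are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax_gating(X, V):
    # eta_m(x) = softmax over kernels of a linear gating model x . v_m
    # (illustrative assumption; the paper's gating function may differ)
    scores = X @ V                                  # shape (n, M)
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def rbf_kernel(X, gamma):
    # k(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def localized_combined_kernel(X, gammas, V):
    # K(x_i, x_j) = sum_m eta_m(x_i) * k_m(x_i, x_j) * eta_m(x_j)
    # Each term is D_m K_m D_m with D_m = diag(eta_m), so K stays PSD.
    eta = softmax_gating(X, V)                      # shape (n, M)
    K = np.zeros((X.shape[0], X.shape[0]))
    for m, g in enumerate(gammas):
        K += np.outer(eta[:, m], eta[:, m]) * rbf_kernel(X, g)
    return K

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
gammas = [0.1, 1.0, 10.0]                           # one RBF kernel per bandwidth
V = rng.normal(size=(3, len(gammas)))               # gating parameters (here random)
K = localized_combined_kernel(X, gammas, V)
```

In an actual LMSVDD training loop, `V` would be optimized jointly with the SVDD dual variables rather than fixed; the sketch only shows how locality enters the combined kernel.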
