Abstract

Using a variable ℓp-norm (p ≥ 1) penalty on the slacks, the recently introduced ℓp-norm Support Vector Data Description (ℓp-SVDD) method has improved novelty detection performance over the baseline approach, sometimes remarkably. This work extends that modelling formalism in several respects. First, a large-margin extension of the ℓp-SVDD method is formulated to enhance generalisation capability by maximising the margin between positive and negative samples. Second, an efficient yet effective method based on the Frank–Wolfe algorithm, with a predictable accuracy guarantee, is presented to optimise the convex objective function of the proposed approach. Finally, it is illustrated that the proposed approach can effectively benefit from a multiple kernel learning scheme to achieve state-of-the-art performance. The proposed method is theoretically analysed using Rademacher complexities to link its classification error probability to the margin, and experimentally evaluated on several datasets to demonstrate its merits against existing methods.
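To make the Frank–Wolfe step concrete, the snippet below is a minimal sketch rather than the paper's implementation: it assumes the classical hard-margin SVDD dual (a convex quadratic minimised over the unit simplex) as a stand-in for the ℓp-SVDD objective, which the abstract does not reproduce, and the function name frank_wolfe_svdd and its parameters are hypothetical. The "predictable accuracy" mentioned above corresponds to the Frank–Wolfe duality gap, which certifies suboptimality at every iterate and shrinks at an O(1/t) rate under the standard 2/(t+2) step size.

```python
import numpy as np

def frank_wolfe_svdd(K, n_iter=200, tol=1e-6):
    """Frank-Wolfe on an SVDD-style dual over the unit simplex.

    Minimises f(a) = a^T K a - a^T diag(K) subject to a >= 0, sum(a) = 1,
    i.e. the classical hard-margin SVDD dual, used here only as a stand-in
    for the paper's lp-SVDD objective (not given in the abstract).
    """
    n = K.shape[0]
    a = np.full(n, 1.0 / n)            # feasible start: centre of the simplex
    kdiag = np.diag(K)
    gap = np.inf
    for t in range(n_iter):
        grad = 2.0 * K @ a - kdiag     # gradient of the quadratic objective
        s = np.argmin(grad)            # linear minimiser over the simplex is a vertex
        gap = grad @ a - grad[s]       # duality gap: upper-bounds f(a) - f*
        if gap < tol:
            break
        gamma = 2.0 / (t + 2.0)        # standard step size, O(1/t) convergence
        a *= 1.0 - gamma               # move towards the selected vertex e_s
        a[s] += gamma
    return a, gap
```

Called with a precomputed kernel (Gram) matrix K, the routine returns the dual coefficients together with the final gap, so the achieved accuracy can be read off directly, which is what makes the approach attractive for a convex objective of this form.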
