Abstract

Support vector data description (SVDD) is a leading classification method for novelty detection that minimizes the volume of a spherical decision boundary enclosing the normal class. Although SVDD has achieved promising performance, it yields a loose boundary on multivariate datasets whose input dimensions are correlated. Inspired by the relationship between kernel principal component analysis (kernel PCA) and the best-fit ellipsoid of a dataset, this study proposes the ellipsoidal data description (ELPDD), which adapts to the feature variance of each dimension. A minimum volume enclosing ellipsoid (MVEE) is constructed around the target data in the kernel PCA subspace and can be learned via an SVM-like objective function with a log-determinant penalty. We also provide a Rademacher complexity bound for our model. Several related problems are investigated in detail. Experiments on artificial and real-world datasets validate the effectiveness of our method.
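For context on the geometric object the abstract builds on: a minimum volume enclosing ellipsoid in plain input space can be computed with the classical Khachiyan algorithm. The sketch below shows that classical construction only; it is not the paper's kernelized SVM-like formulation with the log-determinant penalty, and the function name and tolerance are illustrative.

```python
import numpy as np

def mvee(points, tol=1e-4):
    """Khachiyan's algorithm for the minimum volume enclosing ellipsoid.

    Returns (A, c) such that (x - c)^T A (x - c) <= 1 for every row x
    of `points`, up to the tolerance of the iteration.
    """
    n, d = points.shape
    Q = np.vstack([points.T, np.ones(n)])  # lift points to homogeneous coordinates
    u = np.full(n, 1.0 / n)                # uniform initial weights
    err = tol + 1.0
    while err > tol:
        X = Q @ (u[:, None] * Q.T)         # (d+1)x(d+1) weighted scatter matrix
        M = np.einsum('ij,ji->i', Q.T @ np.linalg.inv(X), Q)  # q_i^T X^{-1} q_i
        j = np.argmax(M)                   # worst-covered point
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        new_u = (1.0 - step) * u
        new_u[j] += step                   # shift weight toward the worst point
        err = np.linalg.norm(new_u - u)
        u = new_u
    c = u @ points                                             # ellipsoid center
    cov = points.T @ (u[:, None] * points) - np.outer(c, c)
    A = np.linalg.inv(cov) / d                                 # shape matrix
    return A, c

# Toy example: the MVEE of the unit square's corners is a circle
# centered at (0.5, 0.5) passing through all four points.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
A, c = mvee(pts)
dists = np.einsum('ij,jk,ik->i', pts - c, A, pts - c)
print(np.round(c, 3), np.round(dists, 3))
```

ELPDD replaces this combinatorial-style construction with a soft-margin objective, so outliers can fall outside the ellipsoid at a penalty rather than inflating it.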
