Abstract

We present a new unsupervised dimensionality reduction technique, called LN-SNE, for anomaly detection. LN-SNE generates a parametric embedding by means of Restricted Boltzmann Machines and uses a heavy-tailed distribution to project data to a lower-dimensional space such that dissimilarities between normal data and anomalies are preserved or strengthened. We compare LN-SNE to several benchmark dimensionality reduction methods on real datasets. The results suggest that LN-SNE is less sensitive to the dimension of the latent space than the other methods and outperforms them in terms of anomaly detection accuracy. We empirically show that our technique scales near-linearly with both the number of dimensions and the data size.
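The abstract does not give LN-SNE's exact formulation, but SNE-family methods typically model low-dimensional pairwise similarities with a heavy-tailed Student-t kernel, which keeps dissimilar points (such as anomalies) far from the bulk of the embedding. As a general, hypothetical illustration of that ingredient (not the authors' implementation), the kernel can be computed as:

```python
import numpy as np

def heavy_tailed_similarities(Y):
    """Student-t (df=1) pairwise similarities over a low-dimensional
    embedding Y of shape (n_points, n_dims), as used in t-SNE-style
    methods. Illustrative only; LN-SNE's exact kernel is not specified
    in the abstract."""
    # Pairwise squared Euclidean distances, shape (n, n).
    D = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    # Heavy-tailed kernel: similarity decays polynomially, not
    # exponentially, so far-away points retain a gradient signal.
    q = 1.0 / (1.0 + D)
    np.fill_diagonal(q, 0.0)  # exclude self-similarity
    return q / q.sum()        # normalize to a joint distribution

# A point far from the two nearby ones gets much lower similarity mass,
# which is what pushes anomalies away from normal data in the embedding.
Y = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
Q = heavy_tailed_similarities(Y)
```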
