Abstract

Detecting outliers in multivariate data sets is more complicated than in univariate cases. The aim of this study is to evaluate the blocked adaptive computationally efficient outlier nominators (BACON) algorithm, the fast minimum covariance determinant (FAST-MCD) method, and the robust Mahalanobis distance (RM) method in multivariate data sets. For this purpose, the outlier detection methods were compared for multivariate normal, Laplace, and Cauchy distributions with different sample sizes and numbers of variables. False-negative and false-positive ratios were used to evaluate the methods' performance. The results indicate that the performance of these methods varies according to the distribution type.
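As a minimal sketch of the kind of comparison the abstract describes, the example below flags multivariate outliers with robust Mahalanobis distances based on an MCD scatter estimate and then computes false-negative and false-positive ratios on simulated data. It assumes scikit-learn's MinCovDet (an implementation of FAST-MCD) and a 97.5% chi-squared cutoff; the sample size, contamination pattern, and cutoff are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
n, p = 200, 3

# Simulated multivariate normal data with a small cluster of shifted outliers
# (the true contamination pattern here is purely illustrative).
X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)
X[:10] += 6.0  # first 10 rows are the true outliers

# FAST-MCD gives a robust location/scatter estimate; .mahalanobis() returns
# squared robust Mahalanobis distances to the MCD center.
mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)

# Flag points beyond the 97.5% chi-squared quantile with p degrees of freedom.
cutoff = chi2.ppf(0.975, df=p)
flagged = d2 > cutoff

true_outlier = np.zeros(n, dtype=bool)
true_outlier[:10] = True

# False negatives: true outliers not flagged; false positives: clean points flagged.
fn_rate = np.mean(~flagged[true_outlier])
fp_rate = np.mean(flagged[~true_outlier])
print(f"false-negative rate: {fn_rate:.3f}, false-positive rate: {fp_rate:.3f}")
```

Repeating such a simulation over different distributions (normal, Laplace, Cauchy), sample sizes, and numbers of variables, and swapping in other detectors such as BACON, is the general pattern of study the abstract summarizes.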
