Monte Carlo simulations and theoretical analyses have repeatedly demonstrated the impact of outliers on statistical analyses. Most simulation studies generate outliers in one of two ways: by multiplying an arbitrary data point by a constant or by drawing from a finite mixture distribution. The latter approach extends to multivariate settings by specifying the Mahalanobis distance between the centroids of two clusters of points. However, when researchers aim to simulate individual data points at prescribed population-level Mahalanobis distances, very few procedures are available. This article generalizes one of the few existing methods to simulate an arbitrary number of outliers in an arbitrary number of dimensions, for both multivariate normal and non-normal data. A small simulation demonstration shows how this methodology enables designs that were previously rare or infeasible because no suitable data-generating algorithm existed. A discussion of potential implications highlights the importance of considering multivariate outliers in simulation settings.
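To illustrate the core idea of placing a point at a prescribed population-level Mahalanobis distance, the following is a minimal sketch, not the article's actual algorithm. It assumes multivariate normal parameters (mean and covariance) and uses the identity that x = mu + d * L u, with L the Cholesky factor of the covariance and u a unit vector, has Mahalanobis distance d from mu. The function name `place_outlier` is hypothetical.

```python
import numpy as np

def place_outlier(mean, cov, md, rng=None):
    """Return a point at population Mahalanobis distance `md` from `mean`
    under covariance `cov`, in a random direction.

    Illustrative sketch only (not the article's method): with L the Cholesky
    factor of cov and u a random unit vector,
    x = mean + md * L @ u  satisfies  (x - mean)' cov^{-1} (x - mean) = md**2.
    """
    rng = np.random.default_rng(rng)
    mean = np.asarray(mean, dtype=float)
    L = np.linalg.cholesky(np.asarray(cov, dtype=float))
    u = rng.standard_normal(mean.shape[0])
    u /= np.linalg.norm(u)            # random direction on the unit sphere
    return mean + md * (L @ u)

# Usage: a 3-dimensional outlier at Mahalanobis distance 5 from the origin
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 1.0, 0.3],
                [0.2, 0.3, 1.0]])
x = place_outlier(np.zeros(3), cov, md=5.0, rng=42)
check = np.sqrt(x @ np.linalg.solve(cov, x))   # recovers 5.0 up to rounding
print(x, check)
```

This sketch handles a single outlier relative to a known centroid; the generalization discussed in the article concerns arbitrary numbers of outliers, arbitrary dimensions, and non-normal data.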