Abstract

The nonparametric multivariate analysis of variance (NPMANOVA) testing procedure has proven to be a valuable tool for comparing groups. In the present paper, we propose a kernel extension of this technique in order to effectively confront high dimensionality, a recurrent problem in many fields of science. The new method is called kernel multivariate analysis of variance (KMANOVA). The basic idea is to take advantage of the kernel framework: we propose to project the data from the original data space to a Hilbert space generated by a given kernel function and then perform the NPMANOVA method in the reproducing kernel Hilbert space (RKHS). Dispersion of the embedded points can be measured by the distance induced by the inner product in the RKHS, but also by many other distances better suited to high-dimensional settings. For this purpose, we study two promising distances: a Manhattan-type distance and a distance based on an orthogonal projection of the embedded points in the direction of the group centroids. We show that the NPMANOVA method and the KMANOVA method with the induced distance are essentially equivalent. We also show that the KMANOVA method with the other two distances performs considerably better than the NPMANOVA method. We illustrate the advantages of our approach in the context of genetic association studies and demonstrate its usefulness on Alzheimer's disease data. We also provide a software implementation of the method, available on GitHub: https://github.com/8699vicente/Kmanova.
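To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of a PERMANOVA-style pseudo-F statistic computed directly from a kernel matrix. It uses the standard identity that the RKHS-induced squared distance is d²(x, y) = k(x, x) + k(y, y) − 2k(x, y), and the usual sums-of-squares decomposition from pairwise distances; the function name `kernel_pseudo_f` and the choice of a linear kernel in the usage example are illustrative assumptions.

```python
import numpy as np

def kernel_pseudo_f(K, labels):
    """PERMANOVA-style pseudo-F computed from a kernel matrix K.

    Squared distances between embedded points follow from the RKHS
    inner product: d^2(x, y) = k(x, x) + k(y, y) - 2 k(x, y).
    This is a didactic sketch, not the KMANOVA package itself.
    """
    labels = np.asarray(labels)
    diag = np.diag(K)
    # Pairwise squared distances in the feature space
    D2 = diag[:, None] + diag[None, :] - 2.0 * K
    n = len(labels)
    groups = np.unique(labels)
    # Total sum of squares: SS_T = (1/n) * sum of d^2 over unordered pairs
    ss_total = D2.sum() / (2.0 * n)
    # Within-group sum of squares, pooled across groups
    ss_within = sum(
        D2[np.ix_(labels == g, labels == g)].sum() / (2.0 * (labels == g).sum())
        for g in groups
    )
    ss_between = ss_total - ss_within
    a = len(groups)
    return (ss_between / (a - 1)) / (ss_within / (n - a))

# Usage: with a linear kernel K = X X^T, the induced distance is Euclidean,
# so this statistic coincides with the ordinary NPMANOVA pseudo-F.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (10, 5)), rng.normal(3, 1, (10, 5))])
labels = np.array([0] * 10 + [1] * 10)
F = kernel_pseudo_f(X @ X.T, labels)
```

In practice the observed pseudo-F would be compared against a null distribution obtained by permuting the group labels, as in the standard NPMANOVA procedure.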