Abstract

We propose a new nonparametric procedure for classifying objects represented by $d$-dimensional vectors into $K \geq 2$ groups. The proposed classifier is inspired by the $k$ nearest neighbour (kNN) method. It is based on the idea of a depth-based distributional neighbourhood and is called the $k$ nearest depth neighbours (kNDN) classifier. The kNDN classifier has several desirable properties: in contrast to the classical kNN, it can exploit global properties of the considered distributions, such as symmetry. In contrast to the maximal depth classifier and related classifiers, it does not deteriorate when the considered distributions differ in dispersion or have unequal prior probabilities. The kNDN classifier is compared to several depth-based classifiers as well as the classical kNN method in a simulation study. In terms of average misclassification rates, it is comparable to the best current depth-based classifiers.
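The abstract does not spell out how the depth-based neighbourhood of the kNDN classifier is constructed, so the sketch below instead illustrates the maximal depth classifier mentioned as a baseline: each point is assigned to the group under which it attains the highest depth. Mahalanobis depth is used here purely as an illustrative choice of depth function; the group sizes, dispersions, and the helper names are hypothetical and not taken from the paper.

```python
import numpy as np

def mahalanobis_depth(x, sample):
    """Mahalanobis depth of point x w.r.t. a sample (rows are observations).

    Depth is 1 / (1 + squared Mahalanobis distance to the sample mean),
    so points near the centre of the sample receive depth close to 1.
    """
    mu = sample.mean(axis=0)
    cov = np.cov(sample, rowvar=False)
    diff = x - mu
    md2 = diff @ np.linalg.solve(cov, diff)  # squared Mahalanobis distance
    return 1.0 / (1.0 + md2)

def max_depth_classify(x, groups):
    """Maximal depth rule: assign x to the group giving it the largest depth."""
    depths = {label: mahalanobis_depth(x, sample) for label, sample in groups.items()}
    return max(depths, key=depths.get)

# Hypothetical two-group example with unequal dispersions, the setting in which
# the abstract notes that maximal-depth-type classifiers run into trouble.
rng = np.random.default_rng(0)
groups = {
    0: rng.normal(loc=0.0, scale=1.0, size=(200, 2)),
    1: rng.normal(loc=1.5, scale=3.0, size=(200, 2)),
}
print(max_depth_classify(np.array([1.0, 1.0]), groups))
```

Because depth is a within-group notion of centrality, a widely dispersed group can give moderate depth to points far from its centre, which is one reason the abstract contrasts the kNDN classifier with this baseline under unequal dispersions and priors.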
