Abstract

The k_n nearest neighbor classification is a nonparametric classification procedure that assigns a random vector Z to one of two populations \pi_1, \pi_2 . Samples of equal size n are taken from \pi_1 and \pi_2 and are ordered separately with respect to their distance from Z = z . The procedure assigns Z to \pi_1 if the distance of the k_n th sample observation from \pi_1 to z is less than the distance of the k_n th sample observation from \pi_2 to z ; otherwise Z is assigned to \pi_2 . This is equivalent to the majority rule of Fix and Hodges [4] and to the nearest neighbor rule of Cover and Hart [3]. This paper studies some asymptotic properties of this procedure, including an expression for a consistent upper bound on the probability of misclassification.
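The decision rule described above can be sketched in a few lines; the function name, the Euclidean metric, and the array-based interface below are illustrative assumptions, not part of the paper:

```python
import numpy as np

def kn_nn_classify(z, sample1, sample2, k):
    """Assign z to population 1 if the k-th nearest observation in
    sample1 is closer to z than the k-th nearest observation in
    sample2; otherwise assign z to population 2.

    sample1, sample2: arrays of shape (n, d), equal sample sizes.
    k: the k_n of the rule, 1 <= k <= n.
    """
    # Distances from each sample point to z, sorted ascending,
    # so index k-1 is the k-th order statistic of the distances.
    d1 = np.sort(np.linalg.norm(sample1 - z, axis=1))
    d2 = np.sort(np.linalg.norm(sample2 - z, axis=1))
    return 1 if d1[k - 1] < d2[k - 1] else 2
```

With k = 1 this reduces to the ordinary nearest neighbor rule; choosing k as a majority threshold recovers the Fix–Hodges majority rule mentioned in the abstract.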
