Abstract

• We propose a parameter-free, instance-based classifier: the HSP classifier.
• Its accuracy outperforms kNN's while retaining kNN's simplicity and classification complexity.
• We also present probabilistic and asymptotic versions of the HSP that improve both the running time and the accuracy of the original proposal.
• The same weighting methods that improve kNN also improve the HSP and its asymptotic variants.

The primary example of instance-based learning is the k-nearest neighbor rule (kNN), praised for its simplicity and its capacity to adapt to new, unseen data and to discard old data. The disadvantages most often mentioned are the classification complexity, which is O(n), and the estimation of the parameter k, the number of nearest neighbors to use. Using an index at classification time lifts the former disadvantage, while there is no conclusive method for the latter. This paper presents a parameter-free instance-based learning algorithm built on the Half-Space Proximal (HSP) graph. The HSP neighbors of a point simultaneously offer proximity to and variety around it. To classify a given query, we compute its HSP neighbors and apply a simple majority rule over them. In our experiments, the resulting classifier outperformed kNN for every k over a battery of datasets. This improvement persists even when weighted majority rules are applied to both the kNN and HSP classifiers. Surprisingly, when a probabilistic index is used to approximate the HSP graph, thereby speeding up classification, our method can even improve its accuracy, in stark contrast with the kNN classifier, whose accuracy worsens with a probabilistic index.
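As a concrete illustration of the classification rule, below is a minimal Python sketch of the HSP construction as commonly described for this graph: the nearest remaining candidate becomes a neighbor, and every candidate lying closer to that neighbor than to the query is discarded (the half-space test); the query's label is then a simple majority vote over its HSP neighbors. The brute-force scan, the Euclidean metric, and all function names are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from collections import Counter

    def hsp_neighbors(query, points):
        # Distances from every dataset point to the query.
        dist_to_q = np.linalg.norm(points - query, axis=1)
        candidates = list(range(len(points)))
        neighbors = []
        while candidates:
            # The nearest remaining candidate becomes an HSP neighbor.
            u = min(candidates, key=lambda i: dist_to_q[i])
            neighbors.append(u)
            candidates.remove(u)
            # Half-space test: drop candidates closer to u than to the query.
            candidates = [i for i in candidates
                          if np.linalg.norm(points[i] - points[u]) >= dist_to_q[i]]
        return neighbors

    def hsp_classify(query, points, labels):
        # Simple majority rule over the labels of the HSP neighbors.
        votes = Counter(labels[i] for i in hsp_neighbors(query, points))
        return votes.most_common(1)[0][0]

    # Toy usage on synthetic data: two Gaussian blobs labeled 0 and 1.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)
    print(hsp_classify(np.array([3.5, 3.5]), X, y))  # expected: 1

Note that the number of neighbors emerges from the half-space test itself rather than from a fixed k, which is what makes the rule parameter-free; a weighted variant would simply weight each neighbor's vote, e.g. by inverse distance to the query.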
