Abstract

Efficient classification methods are often required when common conditions (such as additivity and normal stochastic behaviour) are not satisfied. Three classical classifiers are the Linear Discriminant Analysis (LDA), K Nearest Neighbours (KNN) and Quadratic Discriminant Analysis (QDA) methods. It is known that the performance of these techniques is strongly affected by the absence of linearity in the separation between or among two or more multivariate data classes. In this paper, we propose semiparametric classification methods that can be less sensitive to this phenomenon. Our classifiers are based on the LDA and KNN methods combined with Rényi and Kullback–Leibler stochastic divergences. The performance of the various classifiers is evaluated on both normal and non-normal simulated data. Further, they are applied to classify multidimensional features of synthetic aperture radar images, which have a multiplicative and non-normal nature due to the presence of speckle noise. Results from simulated and real data furnish evidence that the proposed methods can provide classification with smaller error rates than the classical LDA, KNN and QDA procedures. In particular, the optimum performance occurs when the order parameter tends to 0.5 for simulated data, while the best classification is achieved at an order around 0.95 for real data.
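To illustrate the kind of divergence-based rule the abstract describes, below is a minimal sketch (not the authors' implementation) of a classifier that assigns a sample to the class minimizing the closed-form Rényi divergence of order α between Gaussian parameter estimates; all function names and the per-class Gaussian assumption are ours.

```python
import numpy as np

def renyi_gauss(mu0, S0, mu1, S1, alpha=0.5):
    """Closed-form Renyi divergence of order alpha between two
    multivariate Gaussians N(mu0, S0) and N(mu1, S1), alpha in (0, 1)."""
    Sa = (1.0 - alpha) * S0 + alpha * S1          # mixed covariance
    d = mu1 - mu0
    quad = 0.5 * alpha * d @ np.linalg.solve(Sa, d)
    _, ld_a = np.linalg.slogdet(Sa)               # log-determinants
    _, ld0 = np.linalg.slogdet(S0)
    _, ld1 = np.linalg.slogdet(S1)
    log_term = ld_a - (1.0 - alpha) * ld0 - alpha * ld1
    return quad - log_term / (2.0 * (alpha - 1.0))

def classify(sample, class_params, alpha=0.5):
    """Assign a sample (n x p array of feature vectors) to the class
    whose fitted Gaussian is closest in Renyi divergence.
    class_params maps label -> (mean vector, covariance matrix)."""
    mu_s = sample.mean(axis=0)
    S_s = np.cov(sample, rowvar=False)
    return min(class_params,
               key=lambda c: renyi_gauss(mu_s, S_s, *class_params[c], alpha))
```

As α → 1 the Rényi divergence tends to the Kullback–Leibler divergence, so the same rule covers both families mentioned in the abstract; the order α (e.g. 0.5 versus 0.95) is the tuning parameter whose effect the paper studies.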
