Abstract

Dropout and DropConnect are useful methods to prevent multilayer neural networks from overfitting. It turns out that these tools can also be used to estimate the stability of networks with respect to disturbances. Prototype-based networks are attracting increasing attention in current research because of their inherent interpretability and robust behavior. Popular prototype-based classifiers are support vector machines and the heuristically motivated Learning Vector Quantizer (LVQ). The Generalized Matrix LVQ (GMLVQ) is an extension of LVQ which can be interpreted as a special multilayer network containing a projection layer and a prototype layer. First, we extend the linear projection layer of GMLVQ to a non-linear mapping by employing different non-linear activation functions. Second, we compare the classification decision stabilities of the linear and the non-linear GMLVQ under DropConnect, taking the neural network perspective. Thus we can adopt DropConnect ideas known from multilayer perceptron learning to investigate the stability and robustness of GMLVQ. To this end, stability is evaluated in terms of an information-theoretic measure based on the Shannon entropy. We demonstrate the approach for three real-world data sets: Raman spectroscopy, multi-spectral remote sensing, and the well-known MNIST data set.
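The DropConnect-based stability assessment described above can be illustrated with a minimal sketch. This is a hypothetical toy example, not the paper's implementation: we assume a trained GMLVQ projection matrix `Omega` and two class prototypes, repeatedly zero projection weights at a drop probability `p_drop`, and compute the Shannon entropy of the resulting decision distribution as a stability measure (entropy 0 means the decision never changes).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a stand-in for a trained GMLVQ projection
# (4-D input -> 2-D projection space) and two class prototypes.
Omega = rng.normal(size=(2, 4))
prototypes = np.array([[1.0, 0.0],    # class 0 prototype
                       [-1.0, 0.0]])  # class 1 prototype

def classify(x, Omega):
    """Nearest-prototype decision in the projected space."""
    z = Omega @ x
    dists = np.sum((prototypes - z) ** 2, axis=1)
    return int(np.argmin(dists))

def dropconnect_entropy(x, Omega, p_drop=0.3, n_trials=500):
    """Shannon entropy of the classification decision under DropConnect:
    each trial independently zeroes projection weights with prob. p_drop."""
    labels = []
    for _ in range(n_trials):
        mask = rng.random(Omega.shape) >= p_drop  # DropConnect mask
        labels.append(classify(x, Omega * mask))
    probs = np.bincount(labels, minlength=len(prototypes)) / n_trials
    nonzero = probs[probs > 0]
    return float(-np.sum(nonzero * np.log2(nonzero)))

x = rng.normal(size=4)
H = dropconnect_entropy(x, Omega)
# For two classes, H lies in [0, 1] bit; lower means a more stable decision.
```

A low entropy indicates that the classification decision for `x` survives random weight removal, which is the intuition behind using DropConnect as a stability probe.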
