Abstract

Learning Vector Quantization (LVQ) and its cost-function-based variant, Generalized Learning Vector Quantization (GLVQ), are powerful yet simple and interpretable Machine Learning (ML) algorithms for multi-class classification. Although GLVQ is an effective tool for classifying vectorial data in its native form, it is not well suited to handling raw sequence data of potentially different lengths. Usually, this problem is addressed by manually engineering fixed-length features from the raw data or by employing recurrent networks. A natural idea is therefore to incorporate recurrent units for data processing into the GLVQ network structure, so that the processed data can be compared in a latent space for classification decisions. Yet, to the best of our knowledge, little work has been done in this direction; existing methods for handling sequential data in the GLVQ framework are rather unsophisticated and severely outdated. In this paper, we provide a general framework for incorporating recurrent structures into an LVQ network and derive two classification models as variants of Recurrent Learning Vector Quantization, namely RecLVQ and LVQRNN. We also demonstrate the abilities of these approaches on illustrative classification problems.
