Abstract

In this paper, we break with the traditional approach to classification, which regards it as a form of supervised learning. We offer a method and algorithm that make possible fully autonomous (unsupervised) detection of new classes and learning after a very parsimonious training priming (only a few labeled data samples). Moreover, new unknown classes may appear at a later stage, and the proposed xClass method and algorithm are able to discover this and learn from the data autonomously. Furthermore, the features (inputs to the classifier) are automatically sub-selected by the algorithm based on the accumulated data density per feature per class. In addition, the automatically generated model is easy to interpret, is locally generative, and is based on prototypes that define the modes of the data distribution. As a result, a highly efficient, lean, human-understandable, autonomously self-learning model (which needs only an extremely parsimonious priming) emerges from the data. To validate our proposal, we evaluated it on four challenging problems: the imbalanced Faces-1999 database, the Caltech-101 dataset, a vehicles dataset, and the iRoads dataset of images from autonomous driving scenarios. Not only did we achieve higher precision (on one of the problems outperforming all other methods by 25%), but, more significantly, we used only a single class beforehand, while other methods used all the available classes, and we generated interpretable models with a smaller number of features, through extremely weak and weak supervision. We demonstrated the ability to detect and learn new classes for both images and numerical examples.
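To make the feature sub-selection idea concrete, the sketch below shows one plausible interpretation of "accumulated data density per feature per class": for each feature, a Cauchy-type local density is summed over the samples of a class, and features whose accumulated density is highest (i.e. whose values cluster tightly around the class mean) are retained. The function names (`per_feature_density`, `select_features`), the unnormalized density form, and the `keep_ratio` parameter are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def per_feature_density(X):
    """Accumulated Cauchy-type density per feature for one class.

    Assumption: for feature j, each sample contributes
    D_j(x) = 1 / (1 + (x_j - mu_j)^2), and contributions are
    summed over all samples of the class. Tightly clustered
    features accumulate more density.
    """
    mu = X.mean(axis=0)
    d = 1.0 / (1.0 + (X - mu) ** 2)  # per-sample, per-feature density
    return d.sum(axis=0)

def select_features(class_data, keep_ratio=0.5):
    """Keep the top fraction of features by mean accumulated density.

    `class_data` is a list of (n_samples, n_features) arrays, one per class.
    """
    densities = np.stack([per_feature_density(X) for X in class_data])
    score = densities.mean(axis=0)  # average accumulated density across classes
    k = max(1, int(keep_ratio * score.size))
    return np.argsort(score)[::-1][:k]  # indices of the densest features
```

Under this reading, a feature that is nearly constant within each class (a strong prototype direction) scores high, while a widely scattered feature scores low and is dropped, which is one way to obtain the leaner, more interpretable models the abstract describes.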
