Abstract
Incremental learning relies on the availability of ample training data for novel classes, a requirement that is often infeasible in practice, particularly when the new classes are rare and their samples are expensive or difficult to obtain. Incremental learning focuses on the challenging task of continually learning to classify new classes in incoming data without erasing knowledge of old classes. This research presents a comparative analysis of optimization algorithms for training few-shot continual learning models to overcome catastrophic forgetting. The proposed mechanism comprises two stages: pre-processing and classification. Images are first pre-processed with contrast enhancement to improve their quality. The pre-processed outputs are then classified using Continually Evolved Classifiers, which are designed to mitigate catastrophic forgetting. To further enhance performance, the Serial Exponential Sand Cat Swarm Optimization algorithm (SE-SCSO) is employed and compared against ten other algorithms: the Grey Wolf Optimization (GWO) algorithm, Moth Flame Optimization (MFO), Cuckoo Search Optimization Algorithm (CSOA), Elephant Search Algorithm (ESA), Whale Optimization Algorithm (WOA), Artificial Algae Algorithm (AAA), Cat Swarm Optimization (CSO), Fish Swarm Algorithm (FSA), Genetic Bee Colony (GBC) Algorithm, and Particle Swarm Optimization (PSO). In the experiments, SE-SCSO attained the best performance, with an accuracy of 89.6%, specificity of 86%, precision of 83%, recall of 92.3% and F-measure of 87.4%.
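The abstract states only that images are contrast-enhanced before classification, without naming the method. As an illustration, the sketch below applies global histogram equalization, one common contrast-enhancement technique; it is an assumption for demonstration, not necessarily the method used in the paper.

```python
import numpy as np

def equalize_contrast(image: np.ndarray) -> np.ndarray:
    """Global histogram equalization for an 8-bit grayscale image.

    Assumes the image is non-constant (at least two distinct intensities).
    """
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first nonzero cumulative count
    # Map each intensity so the output histogram is approximately uniform.
    lut = np.clip(
        np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255), 0, 255
    ).astype(np.uint8)
    return lut[image]

# Low-contrast synthetic image: intensities squeezed into [100, 150].
rng = np.random.default_rng(0)
low_contrast = rng.integers(100, 151, size=(64, 64)).astype(np.uint8)
enhanced = equalize_contrast(low_contrast)
# The enhanced image spreads the intensities across the full [0, 255] range.
```

After equalization the narrow intensity band is stretched over the full dynamic range, which is the quality improvement the pre-processing stage aims for.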