Abstract

Deep neural networks (DNNs) based on incremental learning support efficient garbage classification. However, accurately learning and preserving the information of known classes while updating a DNN as new tasks continuously emerge remains challenging, and it also degrades the model's generalization performance. To address these issues, an incremental evolution learning (IEL) method based on prototype enhancement is proposed to accurately preserve known-class information and improve generalization. First, a prototype enhancement method based on multi-dimensional Gaussian kernel density estimation is designed, which extends the prototype of each sample according to the high-dimensional nonlinear data distribution, so that the enhanced prototypes accurately represent the known-class data. Second, a contrastive feature method is proposed to constrain feature consistency between tasks, reducing the deviation between tasks; this balances the extraction preference caused by class-sample imbalance and improves generalization. Third, the proposed IEL is applied to garbage classification with imbalanced class samples, effectively adapting to the differences between known and new classes. Finally, experiments on four standard datasets and one public garbage dataset verify that IEL maintains strong classification ability as the number of classes in the learning tasks increases.
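As a rough illustration of the prototype-enhancement idea, the sketch below draws augmented prototypes from a multivariate Gaussian kernel density estimate fitted to one class's feature vectors. This is not the paper's implementation; the function name, the use of Scott's rule for the bandwidth, and all parameters are assumptions for illustration. It relies on the fact that sampling from a Gaussian KDE is equivalent to picking a stored feature at random and adding noise drawn from the kernel.

```python
import numpy as np

def enhance_prototypes(features, n_samples, rng=None):
    """Draw `n_samples` augmented prototypes from a multivariate Gaussian
    KDE fitted to one class's feature vectors (rows of `features`).

    Hypothetical sketch: sampling from a Gaussian KDE amounts to choosing
    a stored feature uniformly at random and perturbing it with Gaussian
    noise whose covariance is the bandwidth-scaled data covariance.
    """
    rng = np.random.default_rng(rng)
    n, d = features.shape
    # Scott's rule bandwidth factor for a d-dimensional Gaussian kernel.
    factor = n ** (-1.0 / (d + 4))
    # Kernel covariance: empirical covariance scaled by the squared bandwidth.
    cov = np.atleast_2d(np.cov(features, rowvar=False)) * factor**2
    # Pick base samples, then add kernel noise around each of them.
    idx = rng.integers(0, n, size=n_samples)
    noise = rng.multivariate_normal(np.zeros(d), cov, size=n_samples)
    return features[idx] + noise

# Usage: enhance 50 stored 8-dimensional features into 200 prototypes.
rng = np.random.default_rng(0)
stored = rng.normal(size=(50, 8))
prototypes = enhance_prototypes(stored, 200, rng=1)
```

Because the enhanced prototypes follow the estimated class distribution rather than collapsing to a single class mean, they can stand in for the original data when the old samples are no longer available during incremental updates.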
