Abstract

Deep learning excels at identifying specific plant diseases. However, multi-domain datasets that span a variety of categories are difficult to handle because of limited data availability. (1) Background: In real-world scenarios, data distributions are uneven, dataset scale keeps growing, new categories emerge, and models suffer from a phenomenon known as ‘catastrophic forgetting’. In addition, models rely on substantial amounts of labeled data for training. (2) Methods: We introduce a two-stage approach. The first stage is a scalable feature-learning phase in which the previous feature representation is frozen and a new feature extractor is trained on incoming and stored data to expand the feature space. In the second stage, an auxiliary loss determines whether key parameters are retained, reducing the instability of the weight parameters; this preserves the separability of old features while encouraging the model to learn new, diverse, and discriminative concepts. (3) Results: Our findings indicate that when the data landscape shifts, multi-task continual learning that leverages the simultaneous availability of datasets achieves significantly higher recognition accuracy than single convolutional networks and multi-task learning models. (4) Conclusions: Our method moves continual learning closer to practical application. It is particularly effective in mitigating catastrophic forgetting on multi-domain datasets and in enhancing the robustness of deep-learning models.
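
To make the two-stage idea concrete, the following is a minimal PyTorch-style sketch of the general pattern described above: the old feature extractor is frozen, a new extractor expands the representation, and an auxiliary penalty discourages drift of important weights. All module names, dimensions, and the specific form of the auxiliary loss are illustrative assumptions, not the authors' actual implementation.

```python
# Illustrative sketch only; module names, dimensions, and the auxiliary loss
# form are assumptions, not the paper's actual code.
import torch
import torch.nn as nn


class ExpandableNet(nn.Module):
    """Stage 1: keep a frozen copy of the old feature extractor and train a
    new extractor to expand the feature space for incoming and stored data."""

    def __init__(self, old_extractor: nn.Module, in_dim: int, feat_dim: int, num_classes: int):
        super().__init__()
        self.old_extractor = old_extractor          # assumed to output feat_dim features
        for p in self.old_extractor.parameters():   # fix the previous representation
            p.requires_grad = False
        self.new_extractor = nn.Sequential(         # expands features for new data
            nn.Linear(in_dim, feat_dim), nn.ReLU()
        )
        # unified head over concatenated old + new features
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x):
        x = x.flatten(1)
        with torch.no_grad():
            old_feat = self.old_extractor(x)        # frozen: preserves old-feature separability
        new_feat = self.new_extractor(x)            # trainable: learns new concepts
        return self.classifier(torch.cat([old_feat, new_feat], dim=1))


def auxiliary_stability_loss(model, ref_params, importance, lam=1e-2):
    """Stage 2 (one possible form): penalise drift of important weights from
    their previous values, reducing instability of key parameters."""
    loss = torch.zeros(())
    for name, p in model.named_parameters():
        if name in ref_params:
            loss = loss + (importance[name] * (p - ref_params[name]) ** 2).sum()
    return lam * loss
```

In training, this auxiliary term would be added to the standard classification loss so that parameters judged important for previously learned categories change little, while the new extractor remains free to capture new, discriminative features.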
