Abstract
This paper proposes an improved Arithmetic Optimization Algorithm (AOA) to train artificial neural networks (ANNs) in dynamic environments. Despite many successful applications of metaheuristic training of ANNs, these studies assume static environments, which is unrealistic for real-world nonstationary processes. In this study, the training of ANNs is modeled as a dynamic optimization problem, and the proposed AOA is used to optimize the connection weights and biases of the ANN in the presence of concept drift. The proposed method is designed for classification tasks. The performance of the proposed algorithm has been tested on twelve dynamic classification problems, and a comparative analysis with state-of-the-art metaheuristic optimization algorithms is provided. The significance of the performance differences between the compared algorithms has been verified using nonparametric statistical tests. The results show that the improved AOA outperforms the compared algorithms in training ANNs under dynamic environments, demonstrating its potential for dynamic data-driven applications.
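The paper's specific AOA improvements are not detailed in the abstract, but the core idea, encoding an ANN's weights and biases as a flat vector and minimizing classification error with the standard AOA update rules (MOA/MOP coefficients, division/multiplication for exploration, subtraction/addition for exploitation), can be sketched as follows. The network size, bounds, parameter values, and synthetic data below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for one window of a dynamic classification stream.
X = rng.normal(size=(60, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # flat weight-vector length

def predict(w, X):
    """Decode a flat weight vector into a one-hidden-layer ANN and classify X."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    h = np.tanh(X @ W1 + b1)
    return (h @ W2 + b2 > 0).ravel().astype(int)

def fitness(w):
    """Objective for the optimizer: misclassification rate on the current window."""
    return float(np.mean(predict(w, X) != y))

def aoa(pop_size=20, iters=50, lb=-2.0, ub=2.0, alpha=5, mu=0.499, eps=1e-12):
    pop = rng.uniform(lb, ub, size=(pop_size, DIM))
    fit = np.array([fitness(p) for p in pop])
    best, best_fit = pop[fit.argmin()].copy(), fit.min()
    for t in range(1, iters + 1):
        # Math Optimizer Accelerated: grows linearly, shifting search toward exploitation.
        moa = 0.2 + t * (0.9 - 0.2) / iters
        # Math Optimizer Probability: shrinks the step size over iterations.
        mop = 1 - (t ** (1 / alpha)) / (iters ** (1 / alpha))
        scale = (ub - lb) * mu + lb
        for i in range(pop_size):
            for j in range(DIM):
                r1, r2, r3 = rng.random(3)
                if r1 > moa:   # exploration: division / multiplication operators
                    pop[i, j] = (best[j] / (mop + eps) * scale if r2 < 0.5
                                 else best[j] * mop * scale)
                else:          # exploitation: subtraction / addition operators
                    pop[i, j] = (best[j] - mop * scale if r3 < 0.5
                                 else best[j] + mop * scale)
            pop[i] = np.clip(pop[i], lb, ub)
            f = fitness(pop[i])
            if f < best_fit:
                best, best_fit = pop[i].copy(), f
    return best, best_fit

best_w, best_err = aoa()
print(f"best training error: {best_err:.3f}")
```

In a dynamic setting, `fitness` would be re-evaluated on each new data window after a drift, so the population can re-converge to the shifted optimum rather than restarting from scratch.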