Abstract

This paper proposes an improved Arithmetic Optimization Algorithm (AOA) for training artificial neural networks (ANNs) in dynamic environments. Although metaheuristic training of ANNs has been applied successfully in many studies, those studies assume static environments, an assumption that is often unrealistic for real-world nonstationary processes. In this study, the training of ANNs is modeled as a dynamic optimization problem, and the proposed AOA is used to optimize the connection weights and biases of the ANN in the presence of concept drift. The proposed method is designed for classification tasks. Its performance has been tested on twelve dynamic classification problems, and a comparative analysis with state-of-the-art metaheuristic optimization algorithms is provided. The significance of the performance differences between the compared algorithms has been verified using nonparametric statistical tests. The results show that the improved AOA outperforms the compared algorithms in training ANNs under dynamic environments. These findings demonstrate the potential of the improved AOA for dynamic data-driven applications.
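To make the setup concrete, the sketch below shows how metaheuristic ANN training of this kind is typically structured: the network's weights and biases are flattened into a single vector, the fitness of a candidate vector is its classification error on the current data window, and a basic AOA loop searches that vector space. This is an illustrative sketch of the standard AOA operators (Abualigah et al.'s division/multiplication exploration and subtraction/addition exploitation phases), not the paper's improved dynamic variant; the dataset, network size, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for one "window" of a drifting data stream:
# a small, linearly separable 2-class problem.
X = rng.normal(size=(40, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

H = 5                       # hidden units (illustrative 2-H-1 network)
DIM = 2 * H + H + H + 1     # all weights and biases, flattened

def ann_error(v):
    """Fitness: classification error of the ANN encoded by flat vector v."""
    W1 = v[:2 * H].reshape(2, H)
    b1 = v[2 * H:3 * H]
    W2 = v[3 * H:4 * H]
    b2 = v[4 * H]
    h = np.tanh(X @ W1 + b1)                       # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output
    return float(np.mean((out > 0.5).astype(int) != y))

def aoa_train(fitness, dim, pop=20, iters=100, lb=-5.0, ub=5.0,
              alpha=5.0, mu=0.5, eps=1e-12):
    """Basic AOA minimization loop over a flat weight vector."""
    pos = rng.uniform(lb, ub, size=(pop, dim))
    fit = np.array([fitness(p) for p in pos])
    best, best_fit = pos[fit.argmin()].copy(), fit.min()
    for t in range(1, iters + 1):
        moa = 0.2 + t * (0.8 / iters)                          # accelerated function
        mop = 1.0 - (t ** (1 / alpha)) / (iters ** (1 / alpha))  # probability term
        term = (ub - lb) * mu + lb
        for i in range(pop):
            for j in range(dim):
                r1, r2, r3 = rng.random(3)
                if r1 > moa:   # exploration: division or multiplication
                    pos[i, j] = (best[j] / (mop + eps) * term if r2 > 0.5
                                 else best[j] * mop * term)
                else:          # exploitation: subtraction or addition
                    pos[i, j] = (best[j] - mop * term if r3 > 0.5
                                 else best[j] + mop * term)
            pos[i] = np.clip(pos[i], lb, ub)
            f = fitness(pos[i])
            if f < best_fit:
                best, best_fit = pos[i].copy(), f
    return best, best_fit

weights, err = aoa_train(ann_error, DIM)
```

In a dynamic setting, the same loop would be re-invoked (or kept running with diversity-preserving mechanisms) each time concept drift changes the data window, so that the population tracks the moving optimum rather than restarting from scratch.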
