Abstract

In this paper, we focus on an important nature-inspired optimization method, the Firefly Algorithm (FA). In a Back-Propagation Neural Network (BPN), the weights are conventionally initialized to a random matrix drawn from a given interval; here, FA is used to generate the initial weights instead. This work emphasizes the approximation of nonlinear discrete-time systems using the FA-based BPN, i.e., FABPN. A neural network behaves as a granular computing methodology whose performance depends on several parameters, including the initial weight matrix. FA is a gradient-free optimization method that balances premature convergence against stagnation. We consider Mean Square Error (MSE) and Mean Absolute Percentage Error (MAPE) as objective functions. The proposed method is based on the characteristic operations of fireflies: updating the population, moving fireflies, and retaining the best solution. The performance of the FA-based Back-Propagation Network (FABPN) is demonstrated on two nonlinear discrete-time systems and one real-world example. The results show that FABPN achieves markedly higher identification accuracy and lower tracking error than the Particle Swarm Optimization-based BPN (PSOBPN). The tracking error of the proposed method lies in a compact set within a close neighborhood of zero.
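The firefly operations mentioned above (population update, attraction-based movement, and retention of the brightest solution) can be sketched as a weight initializer for a small feed-forward network. This is a minimal illustration of the standard Firefly Algorithm with MSE as the objective, not the paper's exact FABPN implementation; the network size, parameter values (`beta0`, `gamma`, `alpha`), and helper names are illustrative assumptions.

```python
import numpy as np

def mse(w, X, y, n_hid=4):
    # Objective: MSE of a one-hidden-layer tanh network whose weights
    # are packed into the flat vector w (hidden matrix + output vector).
    n_in = X.shape[1]
    W1 = w[: n_in * n_hid].reshape(n_in, n_hid)
    W2 = w[n_in * n_hid:]
    pred = np.tanh(X @ W1) @ W2
    return np.mean((pred - y) ** 2)

def firefly_init(X, y, n_fireflies=15, n_iter=50,
                 beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    # Returns an initial weight vector for BPN training, chosen by FA
    # instead of plain random sampling.
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * 4 + 4                      # weights of the toy network
    pop = rng.uniform(-1.0, 1.0, (n_fireflies, dim))   # initial population
    fit = np.array([mse(w, X, y) for w in pop])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fit[j] < fit[i]:               # firefly j is brighter (lower MSE)
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                    # Move firefly i toward j, plus a small random walk.
                    pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(dim) - 0.5)
                    fit[i] = mse(pop[i], X, y)
        alpha *= 0.97                             # gradually damp the random walk
    best = np.argmin(fit)                         # retain the brightest firefly
    return pop[best], fit[best]

# Usage on a toy nonlinear mapping.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (40, 2))
y = np.sin(X[:, 0]) * X[:, 1]
w0, err = firefly_init(X, y)
```

The returned vector `w0` would then seed ordinary back-propagation training, replacing the usual random initialization.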
