The probabilistic neural network (PNN) model is widely used for classification and pattern-recognition problems in data mining. A common way to improve its effectiveness is to tune the PNN classifier's parameters with metaheuristic optimization algorithms. Because a PNN has only a small set of tunable parameters (chiefly the smoothing parameter), metaheuristics offer an efficient way to adjust them. In this study, we employ the Aquila optimizer (AO), a recent metaheuristic, to tune PNN parameters. We propose two methods: the Aquila optimizer based probabilistic neural network (AO-PNN), which relies on both the local and global search capabilities of AO, and the hybrid Aquila optimizer and simulated annealing based probabilistic neural network (AOS-PNN), which combines the global search ability of AO with the local search mechanism of simulated annealing (SA). Our experimental results show that both AO-PNN and AOS-PNN outperform the baseline PNN model in accuracy across all datasets, suggesting that metaheuristic tuning of PNN parameters can produce more accurate classifiers. Moreover, the hybrid approach, AOS-PNN, is more effective than AO-PNN in terms of classification accuracy, data distribution, convergence speed, and statistical significance. We also compare the proposed approaches with three existing methods, namely the Coronavirus herd immunity optimizer based probabilistic neural network (CHIO-PNN), the African buffalo algorithm based probabilistic neural network (ABO-PNN), and β-hill climbing. AO-PNN and AOS-PNN achieve higher classification accuracies of 90.68% and 93.95%, respectively.
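As a rough illustration of the tuning loop described above, the following Python sketch optimizes the smoothing (spread) parameter of a minimal Gaussian-kernel PNN. A uniform random search stands in for AO's global exploration phase, and a simulated-annealing loop refines the best candidate, mirroring the global-then-local structure of AOS-PNN. The toy dataset, search range, perturbation scale, and cooling schedule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class dataset (a stand-in for the paper's benchmark datasets)
X = np.vstack([rng.normal(-1.0, 1.0, (60, 2)), rng.normal(1.0, 1.0, (60, 2))])
y = np.repeat([0, 1], 60)
idx = rng.permutation(len(y))
X_tr, y_tr = X[idx[:80]], y[idx[:80]]
X_val, y_val = X[idx[80:]], y[idx[80:]]

def pnn_predict(X_train, y_train, X_test, sigma):
    """Gaussian-kernel PNN: score each class by the mean kernel response
    of its training points (a Parzen estimate), then take the argmax."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances to x
        k = np.exp(-d2 / (2.0 * sigma ** 2))      # kernel responses
        scores = [k[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

def fitness(sigma):
    """Validation accuracy of the PNN for a candidate smoothing value."""
    return float(np.mean(pnn_predict(X_tr, y_tr, X_val, sigma) == y_val))

# Global phase: broad random sampling of sigma (a crude stand-in for
# AO's exploration; the actual method runs the Aquila optimizer here).
candidates = rng.uniform(0.01, 3.0, 30)
sigma_cur = max(candidates, key=fitness)
f_cur = fitness(sigma_cur)
sigma_best, f_best = sigma_cur, f_cur

# Local phase: simulated-annealing refinement of the best global candidate.
T = 1.0
for _ in range(100):
    cand = abs(sigma_cur + rng.normal(0.0, 0.1))  # perturb the current sigma
    f_cand = fitness(cand)
    # Always accept improvements; accept worse moves with probability exp(df/T)
    if f_cand >= f_cur or rng.random() < np.exp((f_cand - f_cur) / T):
        sigma_cur, f_cur = cand, f_cand
        if f_cur > f_best:
            sigma_best, f_best = sigma_cur, f_cur
    T *= 0.95                                     # geometric cooling schedule

print(f"tuned sigma = {sigma_best:.3f}, validation accuracy = {f_best:.3f}")
```

The two-phase structure is the key design choice: the global phase guards against poor local optima in the fitness landscape, while the SA phase cheaply refines the smoothing value around the most promising region.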