Abstract

The Fish School Search (FSS) algorithm is a heuristic technique for finding globally optimal solutions, notable for its simple implementation and high performance. Since its introduction, FSS has attracted considerable interest among researchers and practitioners worldwide. Modifications of FSS have been applied to practical problems, including image reconstruction in electrical impedance tomography, assembly line balancing, and neural network structure optimization. In this paper, we consider a modification of FSS, known as ETFSS, which uses chaos theory to generate uniformly distributed pseudorandom numbers and incorporates exponential step decay; it is characterized by faster convergence and better performance. To further investigate the performance of this algorithm, we apply ETFSS to neural network loss function optimization. We also compare the described approach with other machine learning techniques, namely the support vector machine (SVM) algorithm, the k-nearest neighbors (KNN) algorithm, and a backpropagation-based neural network trained with the adaptive moment estimation (Adam) optimizer. We visualize classification results using the t-distributed stochastic neighbor embedding (t-SNE) and uniform manifold approximation and projection (UMAP) methods, in order to provide more detail on classification performance and dataset shape. The obtained results confirm that ETFSS can produce slightly more accurate classifications than backpropagation.
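The abstract names two ingredients of ETFSS: a chaotic map used as a source of uniformly distributed pseudorandom numbers, and an exponentially decaying step size. The sketch below illustrates both ideas in Python; the choice of the tent map and the particular decay schedule are illustrative assumptions, not necessarily the paper's exact formulation.

```python
def tent_map_sequence(x0=0.37, n=100, mu=1.999):
    """Generate a chaotic sequence with the tent map.

    Iterates x_{k+1} = mu * min(x_k, 1 - x_k). For mu close to 2
    the orbit is chaotic and its invariant distribution on (0, 1)
    is approximately uniform, so the sequence can stand in for
    uniform pseudorandom draws (the idea behind chaos-based PRNGs).
    """
    xs = []
    x = x0
    for _ in range(n):
        x = mu * min(x, 1.0 - x)
        xs.append(x)
    return xs


def exponential_step(step0, decay, t):
    """Exponentially decaying step size: step_t = step0 * decay**t.

    In an FSS-style search, the individual and volitive step sizes
    would shrink according to such a schedule as iterations proceed,
    trading exploration for exploitation.
    """
    return step0 * decay ** t
```

For example, `exponential_step(0.1, 0.9, t)` yields 0.1, 0.09, 0.081, … as `t` runs over 0, 1, 2, …, while `tent_map_sequence()` supplies the per-iteration random factors without calling a conventional PRNG.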
