This article presents a novel modification of the Harmony Search (HS) algorithm, the Self-regulated Fretwidth Harmony Search (SFHS) algorithm, which self-tunes as the search progresses. This adaptive behavior is independent of the total number of iterations. Moreover, SFHS requires fewer iterations and delivers higher precision than other HS variants. Its effectiveness and performance were assessed by comparing our data against four well-known, recent modifications of HS: IHS (Improved Harmony Search, 2007), ABHS (Adjustable Bandwidth Harmony Search, 2014), PAHS (Parameter Adaptive Harmony Search, 2014), and IGHS (Intelligent Global Harmony Search, 2014). Unlike other works, we did not analyze the data at a given number of iterations. Instead, we ran each algorithm until it achieved a given level of precision and analyzed the number of iterations it required. Our test benchmark contained 30 standard test functions: 11 unimodal, 8 multimodal with fixed dimensions, and 11 multimodal with variable dimensions. The latter included a function whose optimum is located at a different coordinate in each dimension. The search domain for each function was fixed according to the literature, though we also ran tests on the effect of varying it. We carried out a parameter sweep to find adequate values for each parameter of our proposed algorithm, analyzing 100 independent runs for each of 648 parameter combinations. The data confirm that SFHS outperformed the other variants for problems in 2D and in 5D. Scaling the test functions to 10D, 30D, and 50D reduced the convergence rate of SFHS, but it still outperformed IHS, ABHS, and PAHS. In some cases (e.g., the Schwefel function in 30D), SFHS was found to be the fastest approach. We also found that IGHS performs well for optimization problems whose optimum is located at the same coordinate in every dimension, but not as well in other scenarios.
Our proposed algorithm, SFHS, is not hindered by this: it achieved full convergence for a test function with its optimum located at a different coordinate in each dimension, even in 50D and while exploring a search domain of [–1250, 1250]. SFHS also achieves solutions several orders of magnitude more precise than HS, so it stands as a clear improvement over HS, as well as over the tested variants (IHS, ABHS, PAHS, IGHS). Still, our proposed method exhibits some limitations: it requires more parameters than other variants, it did not converge 100% of the time for all high-dimensional functions, and it sometimes needs many iterations to converge. Regarding the first point, we think the extra parameters allow more freedom in the evolution of the algorithm. Regarding the second, we consider it can be addressed by replicating the self-tuning behavior in the remaining parameters. Regarding the third, we estimate that accelerating the evolution of SFHS could prove a very useful strategy. Concerning other optimization strategies, we ran tests in up to 30 dimensions and compared our data against the Firefly algorithm. We found that our proposed method retains a 100% convergence rate, while the convergence rate of Firefly drops drastically in some cases, even reaching 0% at 30 dimensions.
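To make the precision-based stopping criterion concrete, the sketch below implements a basic Harmony Search that runs until a target precision is reached and reports the number of iterations used, rather than stopping at a fixed iteration budget. The geometrically shrinking bandwidth is only a stand-in assumption, since this abstract does not give the actual self-regulated fretwidth update of SFHS; all function and parameter names here are ours, not the paper's.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, max_iter=20000, tol=1e-4, seed=0):
    """Minimize f over box `bounds` with a basic Harmony Search.

    Stops as soon as the best score drops below `tol` (the
    precision-based criterion used in the experiments above) and
    returns (best_score, iterations_used). The bandwidth decay is an
    illustrative assumption, not the SFHS fretwidth rule.
    """
    rng = random.Random(seed)
    # Initialize the harmony memory with random solutions.
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(hms)]
    scores = [f(x) for x in memory]
    for it in range(max_iter):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                  # memory consideration
                val = memory[rng.randrange(hms)][d]
                if rng.random() < par:               # pitch adjustment
                    val += bw * (hi - lo) * rng.uniform(-1, 1)
            else:                                    # random selection
                val = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, val)))        # clamp to the domain
        score = f(new)
        worst = max(range(hms), key=scores.__getitem__)
        if score < scores[worst]:                    # replace worst harmony
            memory[worst], scores[worst] = new, score
        bw *= 0.9995                                 # shrink bandwidth (assumed rule)
        if min(scores) < tol:                        # precision reached: stop
            return min(scores), it + 1
    return min(scores), max_iter

# 2D sphere function as a minimal unimodal test case.
best, iters = harmony_search(lambda x: sum(v * v for v in x),
                             [(-10, 10)] * 2)
print(best, iters)
```

Counting `iters` at a fixed `tol`, as done here, is what lets the comparison in the article rank algorithms by iterations needed for a given precision instead of precision reached in a given number of iterations.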