The mean performance of stochastic optimization algorithms may not average
- Research Article
11
- 10.3390/math10224364
- Nov 20, 2022
- Mathematics
Reporting the empirical results of swarm and evolutionary computation algorithms is a challenging task with many possible difficulties. These difficulties stem from the stochastic nature of such algorithms, as well as their inability to guarantee an optimal solution in polynomial time. This research deals with measuring the performance of stochastic optimization algorithms, as well as the confidence intervals of the empirically obtained statistics. Traditionally, the arithmetic mean is used to measure average performance, but we propose quantiles for measuring average, peak and bad-case performance, and give their interpretations in a context relevant to measuring the performance of metaheuristics. To investigate the differences between the arithmetic mean and quantiles, and to confirm possible benefits, we conducted experiments with 7 stochastic algorithms and 20 unconstrained continuous variable optimization problems. The experiments showed that the median was a better measure of average performance than the arithmetic mean, based on the observed solution quality. Out of 20 problem instances, a discrepancy between the arithmetic mean and the median occurred in 6 instances, of which 5 were resolved in favor of the median and 1 remained unresolved as a near tie. The arithmetic mean was completely inadequate for measuring average performance based on the observed number of function evaluations, while the 0.5 quantile (median) was suitable for that task. Quantiles also proved adequate for assessing peak and bad-case performance. In this paper, we also propose a bootstrap method to calculate the confidence intervals of the probability of the empirically obtained quantiles.
Considering the many advantages of using quantiles, including the ability to calculate probabilities of success in the case of multiple executions of the algorithm and the practically useful method of calculating confidence intervals, we recommend quantiles as the standard measure of peak, average and bad-case performance of stochastic optimization algorithms.
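As a minimal illustration of the recommended reporting style (not the paper's exact procedure, and with invented result values), the quantiles of repeated-run results and a generic percentile-bootstrap confidence interval for the median can be computed as follows:

```python
import numpy as np

def bootstrap_quantile_ci(samples, q, n_boot=5000, alpha=0.05, seed=None):
    """Percentile-bootstrap confidence interval for the q-quantile of `samples`."""
    rng = np.random.default_rng(seed)
    n = len(samples)
    # Quantile of each of n_boot resamples drawn with replacement.
    boot = np.quantile(rng.choice(samples, size=(n_boot, n)), q, axis=1)
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Hypothetical final objective values from 30 repeated runs of some optimizer.
rng = np.random.default_rng(0)
results = rng.lognormal(mean=0.0, sigma=0.5, size=30)

peak = np.quantile(results, 0.1)    # peak (best-case) performance
median = np.quantile(results, 0.5)  # average-case performance
bad = np.quantile(results, 0.9)     # bad-case performance
lo, hi = bootstrap_quantile_ci(results, 0.5, seed=1)
```

For minimization, lower quantiles describe the better-performing runs, so the 0.1 and 0.9 quantiles here stand in for peak and bad-case performance.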
- Research Article
1
- 10.1021/acs.iecr.2c00547
- May 18, 2022
- Industrial & Engineering Chemistry Research
Density functional theory (DFT) is an efficient instrument for describing a wide range of nanoscale phenomena: wetting transition, capillary condensation, adsorption, etc. In this paper, we suggest a method for obtaining the equilibrium molecular fluid density in a nanopore using DFT without calculating the free-energy variation: Variation-Free Density Functional Theory (VF-DFT). This technique can be used to explore confined fluids with complex interactions or additional constraints, and to speed up calculations, which might be crucial in an inverse problem. The fluid density in the VF-DFT approach is represented as a decomposition over a limited set of basis functions. We applied principal component analysis (PCA) to extract the basic patterns from the density function and took them into account in constructing the set of basis functions. The decomposition coefficients of the fluid density over the basis were sought by stochastic optimization algorithms, a genetic algorithm (GA) and particle swarm optimization (PSO), so as to minimize the free energy of the system. Two different fluids were studied, nitrogen at a temperature of 77.4 K and argon at 87.3 K, at a pore size of 3.6 nm, and the performance of the optimization algorithms was compared. We also introduce the Hybrid Density Functional Theory (H-DFT) approach, which combines stochastic optimization methods with the classical Picard iteration method to find the equilibrium fluid density starting from a physically appropriate solution. The combination of Picard iteration and stochastic algorithms significantly speeds up the calculation of the equilibrium density in the system without losing solution quality, especially at high relative pressures and with a pronounced layering structure.
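The density-representation step can be sketched generically. In the toy example below, all profile shapes, grid sizes, and parameter ranges are invented for illustration: PCA via an SVD of precomputed density profiles yields a small set of basis functions, so an optimizer only has to search over a handful of coefficients instead of a full grid of density values.

```python
import numpy as np

rng = np.random.default_rng(0)
z = np.linspace(0.0, 3.6, 200)  # coordinate across a 3.6 nm pore

# Stand-in "training" density profiles: a two-parameter layered family.
params = rng.uniform(0.2, 1.0, size=(50, 2))
profiles = np.array([
    (1 + 0.5 * a) * np.exp(-b * z) * (1 + 0.3 * np.cos(2 * np.pi * z / 0.35))
    for a, b in params
])

mean = profiles.mean(axis=0)
# PCA via SVD of the centered data: rows of Vt are orthonormal basis functions.
_, _, Vt = np.linalg.svd(profiles - mean, full_matrices=False)
basis = Vt[:5]

# A trial density is mean + coefficients @ basis; a GA or PSO would then
# search over 5 coefficients rather than 200 grid values.
target = profiles[0]
coeffs = basis @ (target - mean)
reconstruction = mean + coeffs @ basis
rel_err = np.linalg.norm(reconstruction - target) / np.linalg.norm(target)
```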
- Research Article
- 10.12733/jics20103224
- May 1, 2014
- Journal of Information and Computational Science
To overcome shortcomings of the traditional Particle Swarm Optimization (PSO) algorithm, this study proposes a simultaneous perturbation stochastic approximation PSO algorithm and applies it to agricultural acreage evaluation. In the experiment, a regression model is optimized with both the proposed algorithm and traditional PSO to demonstrate the superiority of the former in agricultural acreage evaluation. The experimental results show that the proposed algorithm outperforms traditional PSO and that its application to agricultural acreage evaluation is feasible.
- Conference Article
7
- 10.1109/icassp.2013.6639010
- May 1, 2013
Log-linear models find a wide range of applications in pattern recognition. The training of log-linear models is a convex optimization problem. In this work, we compare the performance of stochastic and batch optimization algorithms. Stochastic algorithms are fast on large data sets but cannot be parallelized well. In our experiments on a broadcast conversations recognition task, stochastic methods yield competitive results after only a short training period, but when enough computational resources are spent on parallelization, batch algorithms are competitive with stochastic algorithms. We obtained slight improvements by using a stochastic second-order algorithm. Our best log-linear model outperforms the maximum likelihood trained Gaussian mixture model baseline while being ten times smaller.
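The stochastic-versus-batch contrast can be sketched on a generic convex log-linear objective, here a toy logistic regression. The data, learning rates, and batch sizes below are invented for illustration; this is not the paper's recognition setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 2000, 5
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = (1 / (1 + np.exp(-X @ true_w)) > rng.uniform(size=n)).astype(float)

def grad(w, Xb, yb):
    p = 1 / (1 + np.exp(-Xb @ w))     # model probabilities
    return Xb.T @ (p - yb) / len(yb)  # gradient of the convex log-loss

# Stochastic training: one pass over the data in small, cheap mini-batches.
w_sgd = np.zeros(d)
for i in range(0, n, 20):
    w_sgd -= 0.5 * grad(w_sgd, X[i:i + 20], y[i:i + 20])

# Batch training: each step uses all data and parallelizes trivially.
w_batch = np.zeros(d)
for _ in range(100):
    w_batch -= 0.5 * grad(w_batch, X, y)

def logloss(w):
    p = 1 / (1 + np.exp(-X @ w))
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```

A single stochastic pass already lowers the loss well below its starting value, while the batch optimizer needs many full-data steps to match it, which mirrors the trade-off described in the abstract.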
- Research Article
3
- 10.14257/ijhit.2015.8.5.10
- May 31, 2015
- International Journal of Hybrid Information Technology
Test functions play an important role in validating and comparing the performance of optimization algorithms. Such functions should have diverse properties so that they are useful for testing any new algorithm, and the efficiency, reliability and validity of an optimization algorithm can be assessed against a set of standard benchmarks. Optimization problems arise widely in science and technology and can be very complex; Particle Swarm Optimization (PSO) is a stochastic algorithm for solving them. This paper collects test functions that can be used to assess the performance of the PSO algorithm, in order to improve it and obtain better results. Different test functions suit different types of problems, and each has a specific search range and known optima, so applying them to the PSO algorithm allows a better comparison of results. The functions most commonly adopted to assess the performance of PSO-based algorithms are presented with details of each, such as the search range, the position of the known optima, and other relevant properties.
- Conference Article
2
- 10.2118/172588-ms
- Mar 8, 2015
This study presents an investigation of the performance of multiple stochastic optimization algorithms in performing automatic type-curve matching for pressure transient well test analysis. The pressure transient responses of a vertical well in a dual porosity reservoir were generated: a synthetic reservoir model that shows all flow regimes for the model was created, and Gaussian white noise was added to the typical response to imitate measured data. In addition to the Levenberg-Marquardt algorithm, four stochastic algorithms were used to estimate the reservoir model from the noisy data: Differential Evolution, Particle Swarm Optimization, Local Unimodal Sampling and Many Optimizing Liaisons. The behavioral parameters of each algorithm were investigated by comparing the performance of values recommended in the literature. Each algorithm was run for 25 realizations, the results were ordered from best to worst, and performance was compared at the best 1st, 7th, 19th and 25th results of each algorithm. The results showed that algorithm performance is affected by the model and the unknowns. The Differential Evolution algorithm showed the best performance in the dual porosity reservoir when Øm, λ, ω, skin, re & kf are the unknowns. All the other stochastic algorithms performed better than the Levenberg-Marquardt optimization algorithm.
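The ordered-results comparison used above is easy to reproduce generically. The sketch below uses invented misfit values, not the study's data, to show how the best 1st, 7th, 19th and 25th of 25 realizations are extracted per algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in final match errors: 25 independent realizations per algorithm
# (hypothetical values for illustration only).
runs = {
    "Differential Evolution": rng.lognormal(-2.0, 0.4, 25),
    "Particle Swarm Optimization": rng.lognormal(-1.6, 0.5, 25),
    "Levenberg-Marquardt": rng.lognormal(-1.0, 0.2, 25),
}

# Pick the best 1st, 7th, 19th and 25th ordered results of each algorithm.
picks = [0, 6, 18, 24]
table = {name: np.sort(vals)[picks] for name, vals in runs.items()}
```

Comparing several ordered positions rather than a single best value exposes both peak performance (1st) and robustness across realizations (19th, 25th).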
- Research Article
22
- 10.18178/ijmlc.2016.6.3.593
- Jun 1, 2016
- International Journal of Machine Learning and Computing
Reporting the results of optimization algorithms in evolutionary computation is a challenging task with many potential pitfalls. The source of these problems is the algorithms' stochastic nature and their inability to guarantee an optimal solution in polynomial time. One of the basic questions that is often not addressed concerns the method of summarizing the entire distribution of solutions into a single value. Although the mean value is used by default for that purpose, the best solution obtained is also occasionally used in addition to or instead of it. Based on our analysis, presented in this paper, of different possibilities for measuring the performance of stochastic optimization algorithms, we propose quantiles as the standard measure of performance. Quantiles can be naturally interpreted for the designated purpose. Besides, they are defined even when the arithmetic mean is not, and are applicable in cases of multiple executions of an algorithm. Our study also showed that, contrary to many other fields, in the case of stochastic optimization algorithms greater variability in the measured data can be considered an advantage.
- Book Chapter
6
- 10.1007/978-3-319-72926-8_7
- Dec 21, 2017
In this paper, a study of how to compare the performance of multi-objective stochastic optimization algorithms using quality indicators and the Deep Statistical Comparison (DSC) approach is presented. DSC is a recently proposed approach for the statistical comparison of meta-heuristic stochastic optimization algorithms on single-objective problems. The main contribution of DSC is a ranking scheme based on the whole distribution of results, instead of only one statistic such as the mean or median. Experiments with 6 multi-objective stochastic optimization algorithms on 16 test problems show that DSC gives more robust results than some standard statistical approaches recommended for comparing multi-objective stochastic optimization algorithms according to some quality indicator.
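The core DSC idea, ranking on whole result distributions rather than a single statistic, can be sketched in simplified form. The example below is a loose illustration, not the published DSC procedure: a two-sample Kolmogorov-Smirnov test decides whether two result distributions differ, and algorithms whose distributions are not significantly different share a rank.

```python
import numpy as np

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max distance between ECDFs."""
    grid = np.sort(np.concatenate([a, b]))
    ecdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    ecdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(ecdf_a - ecdf_b))

def distribution_ranks(samples):
    """Rank algorithms on whole result distributions: those whose
    distributions do not differ significantly share a rank."""
    def differ(a, b):
        # Approximate KS critical value at the 0.05 significance level.
        n, m = len(a), len(b)
        return ks_stat(a, b) > 1.36 * np.sqrt((n + m) / (n * m))
    order = sorted(samples, key=lambda k: np.median(samples[k]))
    ranks, rank = {}, 1
    for prev, cur in zip([None] + order, order):
        if prev is not None and differ(samples[prev], samples[cur]):
            rank += 1
        ranks[cur] = rank
    return ranks

rng = np.random.default_rng(0)
samples = {
    "A": rng.normal(0.0, 1.0, 100),  # A and B drawn from the same law
    "B": rng.normal(0.0, 1.0, 100),
    "C": rng.normal(2.0, 1.0, 100),  # C clearly worse (minimization)
}
ranks = distribution_ranks(samples)
```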
- Research Article
2
- 10.1080/0305215x.2022.2127698
- Nov 23, 2022
- Engineering Optimization
The present work compares the performance of scientific law-inspired optimization algorithms for real-life constrained optimization applications. Ten such algorithms developed during the past decade are considered in this article. A constrained engineering application, the Stirling heat engine system, is investigated with these algorithms; four operating variables and two output constraints of the engine are considered for optimization. Comparative results are presented with statistical data to judge the performance of the algorithms and subsequently to identify the statistical significance and rank of each algorithm. The effects of various constraint handling methods on the performance of the algorithms are evaluated, and the behaviour of these methods is analysed. The effect of the output constraints on the performance of the algorithms is also evaluated. Finally, the convergence behaviour of the competitive algorithms is obtained and demonstrated.
- Conference Article
25
- 10.1109/cec.2009.4982973
- May 1, 2009
In this article we apply information visualization techniques to the domain of swarm intelligence. We describe an intuitive approach that enables researchers and designers of stochastic optimization algorithms to efficiently determine trends and identify optimal regions in an algorithm's parameter search space. The parameter space is evenly sampled using low-discrepancy sequences, and visualized using parallel coordinates. Various techniques are applied to iteratively highlight areas that influence the optimization algorithm's performance on a particular problem. By analyzing experimental data with this technique, we were able to gain important insight into the complexity of the target problem domain. For example, we were able to confirm some underlying theoretical assumptions of an important class of population-based stochastic algorithms. Most importantly, the technique improves the efficiency of finding good parameter settings by orders of magnitude.
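The even-sampling step described above can be sketched with a pure-NumPy Halton sequence, one concrete low-discrepancy construction (the article does not specify which sequence it used, and the PSO-style parameter names and bounds below are illustrative assumptions):

```python
import numpy as np

def van_der_corput(n, base):
    """First n terms of the base-`base` van der Corput low-discrepancy sequence."""
    seq = np.zeros(n)
    for i in range(n):
        f, x, k = 1.0, 0.0, i + 1
        while k > 0:
            f /= base
            x += f * (k % base)
            k //= base
        seq[i] = x
    return seq

def halton(n, bases):
    """n points of the Halton sequence in [0,1)^d, one prime base per axis."""
    return np.column_stack([van_der_corput(n, b) for b in bases])

# Evenly cover a PSO-style parameter space: inertia weight w in [0, 1],
# acceleration coefficients c1, c2 in [0, 2.5].
lo = np.array([0.0, 0.0, 0.0])
hi = np.array([1.0, 2.5, 2.5])
points = lo + halton(128, bases=(2, 3, 5)) * (hi - lo)
```

Each row is one (w, c1, c2) setting to benchmark; the resulting performance scores per setting are what the parallel-coordinates visualization then displays.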
- Research Article
35
- 10.1016/j.renene.2018.09.057
- Sep 18, 2018
- Renewable Energy
Ineffectiveness of optimization algorithms in building energy optimization and possible causes
- Research Article
14
- 10.1016/j.ejor.2015.06.044
- Jun 23, 2015
- European Journal of Operational Research
A multi-layer line search method to improve the initialization of optimization algorithms
- Research Article
11
- 10.1016/j.asoc.2016.07.033
- Jul 25, 2016
- Applied Soft Computing
Optimization techniques in respiratory control system models
- Preprint Article
1
- 10.5194/egusphere-egu2020-1792
- Mar 23, 2020
The convergence performance of global optimization algorithms determines the reliability of the optimized parameter set of hydrological models, thereby affecting prediction accuracy. This study applies advanced data analysis and visualization techniques to design a novel framework for characterizing and visualizing the convergence behavior of optimization algorithms when used for the parameter calibration of hydrological models. First, we use violin plots to assess the convergence levels and speeds in individual parameter spaces (ECP-VP); the density distributions of the violin plots match the possible properties of the fitness landscapes. Then, parallel coordinates techniques are used to simulate the dynamic convergence behavior and assess the convergence performance in the multi-parameter space (ECP-PC). Furthermore, the possible mechanism by which linear or nonlinear relationships between the parameters affect convergence performance is investigated using the maximal information coefficient (MIC) and the Pearson correlation coefficient (Pearson r). Finally, the effect of parameter sensitivity on convergence performance is analyzed. The proposed framework is applied under multi-period and multi-basin dynamic conditions as case studies. The results showed that the ECP-VP and ECP-PC techniques were well suited to evaluating the convergence performance of global optimization algorithms for hydrological models, and the evaluation results provided valuable information for determining the reliability of the final optima as well as the dominant response modes of hydrological models. It was also demonstrated that the convergence levels and speeds in pairwise parameter spaces depend on linear correlations between the parameters but not on nonlinear ones. Additionally, there is no significant relationship between the sensitivity of the parameters and their convergence performance.
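The linear-versus-nonlinear distinction above comes down to what Pearson r can and cannot detect. The sketch below uses invented parameter samples for a hypothetical 3-parameter model (MIC is omitted, as it requires a dedicated library): Pearson r picks up a nearly linear dependence but stays near zero for a strong nonlinear one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in final population of a calibration run: 200 candidate values of
# each of three hypothetical model parameters.
p1 = rng.normal(size=200)
p2 = 0.9 * p1 + 0.1 * rng.normal(size=200)         # nearly linear in p1
p3 = np.sin(3 * p1) + 0.1 * rng.normal(size=200)   # strongly nonlinear in p1

pop = np.column_stack([p1, p2, p3])
r = np.corrcoef(pop, rowvar=False)  # pairwise Pearson r matrix

# |r[0, 1]| is large (a linear dependence of the kind the study links to
# slower pairwise convergence), while |r[0, 2]| stays small even though
# p3 depends deterministically on p1.
```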
- Research Article
35
- 10.1016/j.asoc.2019.105977
- Dec 5, 2019
- Applied Soft Computing
DSCTool: A web-service-based framework for statistical comparison of stochastic optimization algorithms