Abstract

We present MEMPSODE, a global optimization software tool that integrates two prominent population-based stochastic algorithms, namely Particle Swarm Optimization and Differential Evolution, with well-established, efficient local search procedures made available via the Merlin optimization environment. The resulting hybrid algorithms, also referred to as Memetic Algorithms, combine the space-exploration advantage of their global part with the efficiency of the local search and, as expected, they display highly efficient behavior in solving diverse optimization problems. The proposed software is carefully parametrized so as to offer complete control and allow its algorithmic virtues to be fully exploited. It is accompanied by comprehensive examples and a large set of widely used test functions, including tough atomic cluster and protein conformation problems.

Program summary

Program title: MEMPSODE (MEMetic Particle Swarm Optimization and Differential Evolution)
Catalogue identifier: AELM_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELM_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 14 877
No. of bytes in distributed program, including test data, etc.: 592 244
Distribution format: tar.gz
Programming language: ANSI C, ANSI Fortran-77
Computer: Workstations
Operating system: Developed under the Linux operating system using the GNU compilers v.4.4.3. It has also been tested under Solaris and the Cygwin environment.
RAM: The code uses O(n×N) internal storage, n being the dimension of the problem and N the maximum population size. The required memory is dynamically allocated.
Word size: 64 bits
Classification: 4.9
Subprograms used: Cat Id: AAXW_v4_0; Title: MERLIN-3.1.1; Reference: CPC 159 (2004) 70

Nature of problem: Optimization is a valuable mathematical tool for solving a plethora of scientific and engineering problems. Usually, the underlying problems are modeled with objective functions whose minimizers (or maximizers) correspond to the desired solutions of the original problem. In many cases, there is a multitude of such minimizers that correspond to solutions either locally, i.e., in their close neighborhood, or globally, i.e., with respect to the whole search space. A significant number of efficient algorithms exist for addressing optimization problems. One can distinguish two main categories, based on whether they perform better global (exploration) or local (exploitation) search. Standard local optimization algorithms converge rapidly towards local minimizers, but they are also prone to getting trapped in their vicinity. These algorithms usually exploit local information of the objective function, including first- and second-order derivatives. On the other hand, global optimization algorithms are designed to perform better exploration, although at the cost of questionable convergence properties. Typically, these approaches incorporate stochastic operations. The form of the optimization problem at hand plays a crucial role in the selection of the most appropriate algorithm. Objective functions that lack convenient mathematical properties (such as differentiability or continuity) may raise applicability issues for algorithms that require derivatives, while applications that require high accuracy may be laborious for stochastic algorithms. The existence of a multitude of local and/or global minimizers can render these problems even harder for any single optimization algorithm.

Solution method: Evolutionary Algorithms and Swarm Intelligence approaches have been established as effective global optimization algorithms that make minor assumptions about the objective function. Particle Swarm Optimization (PSO) and Differential Evolution (DE) hold a salient position among the most successful algorithms of these categories. Numerous studies indicate that their performance can be radically improved when they are combined with efficient local optimization schemes. The resulting hybrid algorithms offer more balanced search intensification/diversification than the original ones, thereby increasing both their efficiency and effectiveness. Such hybrid schemes are called Memetic Algorithms, and they have attracted rapidly growing interest over the past few years. We present MEMPSODE (MEMetic, PSO and DE), a global optimization software that implements memetic PSO and DE within a unified framework (a minimal sketch of one memetic PSO iteration is given after this summary). The software utilizes local search procedures from the established Merlin optimization environment. The performance of the implemented approaches is illustrated on several examples, including hard optimization tasks such as atomic cluster and protein conformation problems.

Restrictions: The current version of the software uses double precision arithmetic. However, it can be easily adapted by the user to handle integer or mixed-integer problems.

Unusual features: The software takes into account only bound constraints. General constraints may be tackled by user-defined penalty or barrier functions that can be easily incorporated in the source code of the objective function (an illustrative sketch is given after this summary).

Additional comments: The use of the Merlin Optimization Environment 3.1.1 (see subprograms above) is optional. A comprehensive user manual is provided that covers the installation procedure and gives detailed examples of operation.

Running time: The running time depends solely on the complexity of the objective function (and of its derivatives, if used) as well as on the available computational budget (number of function evaluations). The provided test run (Rastrigin function, n = 10) requires 2.5×10^6 function evaluations (2.8 seconds on an i7-920 CPU).
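To make the memetic scheme referred to under "Solution method" concrete, the following is a minimal, self-contained C sketch of one memetic PSO iteration: a standard inertia-weight velocity/position update, followed by an occasional local refinement of a particle. It is an illustration under stated assumptions, not the MEMPSODE implementation; the coefficient values, the local-search probability ls_rate and the local_search callback are hypothetical placeholders (in MEMPSODE the local searches are supplied by the Merlin environment).

    /* Minimal sketch of one memetic PSO iteration (illustrative only; the
       actual MEMPSODE code and its Merlin-based local searches differ). */
    #include <stdlib.h>

    #define W  0.729   /* inertia weight (assumed value)        */
    #define C1 1.49    /* cognitive coefficient (assumed value) */
    #define C2 1.49    /* social coefficient (assumed value)    */

    static double urand(void) { return (double)rand() / RAND_MAX; }

    /* One iteration over a swarm of N particles in n dimensions.
       x, v, pbest are N*n arrays; gbest is an n-vector; f() is the objective;
       local_search() stands in for a Merlin-like local refinement. */
    void memetic_pso_step(int N, int n,
                          double *x, double *v, double *pbest, double *pbest_f,
                          double *gbest, double *gbest_f,
                          double (*f)(const double *, int),
                          void (*local_search)(double *, int),
                          double ls_rate)
    {
        int i, j;
        for (i = 0; i < N; i++) {
            for (j = 0; j < n; j++) {
                int k = i * n + j;
                /* standard PSO velocity and position update */
                v[k] = W * v[k]
                     + C1 * urand() * (pbest[k] - x[k])
                     + C2 * urand() * (gbest[j] - x[k]);
                x[k] += v[k];
            }
            /* memetic part: with small probability refine this particle locally */
            if (local_search != NULL && urand() < ls_rate)
                local_search(&x[i * n], n);

            /* update personal and global bests */
            {
                double fx = f(&x[i * n], n);
                if (fx < pbest_f[i]) {
                    pbest_f[i] = fx;
                    for (j = 0; j < n; j++) pbest[i * n + j] = x[i * n + j];
                    if (fx < *gbest_f) {
                        *gbest_f = fx;
                        for (j = 0; j < n; j++) gbest[j] = x[i * n + j];
                    }
                }
            }
        }
    }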
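The "Unusual features" entry notes that general constraints can be handled by user-written penalty or barrier terms placed inside the objective function. The sketch below illustrates this with the Rastrigin function of the distributed test run, augmented by a quadratic exterior penalty for a hypothetical constraint g(x) <= 0; the function signature, the example constraint and the penalty weight rho are assumptions made for illustration and are not the exact interface expected by MEMPSODE.

    /* Hedged sketch: Rastrigin objective plus a user-added quadratic penalty
       for a hypothetical constraint g(x) <= 0.  The signature is generic and
       not necessarily the interface MEMPSODE expects. */
    #include <math.h>

    /* example constraint: sum(x) - 1 <= 0 (purely illustrative) */
    static double g(const double *x, int n)
    {
        double s = 0.0;
        int i;
        for (i = 0; i < n; i++) s += x[i];
        return s - 1.0;
    }

    double penalized_rastrigin(const double *x, int n)
    {
        const double PI  = 3.14159265358979323846;
        const double rho = 1.0e3;            /* penalty weight (assumed value) */
        double f = 10.0 * n, viol;
        int i;

        /* Rastrigin function: f(x) = 10 n + sum( x_i^2 - 10 cos(2 pi x_i) ) */
        for (i = 0; i < n; i++)
            f += x[i] * x[i] - 10.0 * cos(2.0 * PI * x[i]);

        /* quadratic exterior penalty applied only when the constraint is violated */
        viol = g(x, n);
        if (viol > 0.0)
            f += rho * viol * viol;

        return f;
    }

The Rastrigin part alone corresponds to the test function of the provided run (n = 10); the penalty term illustrates the optional user extension described above.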
