Documentation/Modules/OptAlgOpO


Summary

The (1+1) Evolution Strategy with 1/5 success rule ((1+1)-ES) is one of the simplest Evolution Strategies and was proposed by Rechenberg [1]. The algorithm has a population of two individuals: one parent xp and one child xc. The parent generates the child by mutation. The objective function value of the child, f(xc), is computed and compared to that of its parent, f(xp). If the child has the better fitness (the lower objective function value), the child becomes the new parent; otherwise, the parent is kept.

This algorithm is designed for continuous design variables and has limited performance on discrete design variables. Furthermore, the algorithm is limited to the minimization of a single objective function.

Properties

General

Algorithm: stochastic
Design variables: designed for continuous variables. Limited capabilities for discrete and mixed design variables; preferably the number of discrete values is high (i.e. no binary problems). Limited performance for highly mis-scaled and/or strongly correlated objective functions.
Objectives: single objective, minimization.
Constraint handling: no
Boundary handling: no
Initialization: requires at least one of the following: an initial solution, an initial search region, or bounds.

Connections

Starting at this module: the module requires exactly one connection of type "optimization".
Ending at this module:

Actions

Run: starts the optimization.

Options

The options are currently described only via pop-up help.

Introduction

The (1+1)-ES has a population of two individuals: one parent and one child. Two nature-inspired operators are used, mutation and selection, by which the strategy iteratively approaches the optimum of a problem. Here we consider the minimization of a function f(x) of a vector of design variables x.

1. Initialization

The algorithm starts with a single parent xp, whose fitness f(xp) is computed.
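
As a small illustration (not the OpenDino source), the initialization step could look as follows in Python/NumPy; the function name initialize and the box bounds lower/upper are assumptions for this sketch.

    import numpy as np

    rng = np.random.default_rng()

    def initialize(f, lower, upper):
        # Draw the initial parent uniformly inside the given box bounds and
        # evaluate its fitness; an explicit initial solution could be used instead.
        x_parent = rng.uniform(lower, upper)
        f_parent = f(x_parent)
        return x_parent, f_parent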

2. Creating a Child

A child xc is created by mutating the design variables of the parent:

xc = xp + z,

where z is a random vector whose components zi are drawn from a normal (Gaussian) distribution:

zi ~ N(0, S^2),

with zero mean and standard deviation S. The step size S controls the strength of the mutation and is adapted during the optimization.
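
A minimal Python/NumPy sketch of this mutation (variable names are illustrative and x_parent is assumed to be a NumPy array; this is not the OpenDino implementation):

    import numpy as np

    rng = np.random.default_rng()

    def mutate(x_parent, step_size):
        # z has one normally distributed component per design variable,
        # with zero mean and standard deviation S (= step_size).
        z = step_size * rng.standard_normal(x_parent.shape)
        return x_parent + z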

3. Selecting the Next Parent

In the selection step, the fitness of the child is compared to that of the parent; the individual with the lower objective function value (the better fitness) becomes the parent of the next generation, and the other individual is discarded.
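
A self-contained Python sketch of this step; the function name select and the convention that ties count as successes are assumptions of this sketch, not OpenDino conventions.

    def select(x_parent, f_parent, x_child, f_child):
        # Plus-selection: the child replaces the parent only if it is at least
        # as good; ties are counted as successes here (a common convention).
        success = f_child <= f_parent
        if success:
            return x_child, f_child, success
        return x_parent, f_parent, success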

4. Step Size Adaptation

The step size S controls the strength of the mutation. For the normally distributed mutation, S describes the radius of a sphere around the parent within which, on average, 62.5% of the children lie. The step size should be adapted to account for the convergence of the algorithm towards the minimum: at the beginning of an optimization, the parent may be far from the optimum and large step sizes are preferred; the step size should then decrease while approaching the minimum.

A relationship between the ideal step size S and the fraction of children that improve on their parent is given by the 1/5 success rule:

Let N be the number of design variables of the optimization problem. After every N mutations, check how many successes have occurred over the preceding 10N mutations. If this number is less than 2N, multiply the step size by the factor 0.85; if more than 2N successes occurred, divide the step size by 0.85.

Please note that one mutation means the creation of one child.
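
A Python sketch of the rule as stated above; the bookkeeping with a deque of success flags and the function name adapt_step_size are assumptions of this sketch.

    from collections import deque

    def adapt_step_size(step_size, successes, n_dim, factor=0.85):
        # successes: deque of booleans, one entry per mutation, maxlen = 10 * n_dim.
        # Intended to be called after every n_dim mutations.
        if len(successes) < 10 * n_dim:
            return step_size            # not enough history yet
        s = sum(successes)
        if s < 2 * n_dim:               # success rate below 1/5: reduce the step size
            return step_size * factor
        if s > 2 * n_dim:               # success rate above 1/5: increase the step size
            return step_size / factor
        return step_size                # exactly 1/5: leave unchanged

    # Example bookkeeping: successes = deque(maxlen=10 * n_dim), with True/False
    # appended after every mutation depending on whether the child was accepted.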

5. Decision

If no termination criterion is met, repeat from Step 2 (Creating a Child).
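
Putting the steps together, the following is a self-contained Python/NumPy sketch of a (1+1)-ES with 1/5 success rule, applied to the sphere function as a stand-in objective. All names are illustrative; this is a sketch of the algorithm described above, not the OpenDino implementation.

    import numpy as np
    from collections import deque

    def one_plus_one_es(f, x0, step_size=1.0, max_evals=10000, factor=0.85):
        rng = np.random.default_rng()
        x_parent = np.asarray(x0, dtype=float)
        n = x_parent.size
        f_parent = f(x_parent)                  # 1. Initialization
        history = deque(maxlen=10 * n)          # success flags of the last 10*N mutations
        for evals in range(1, max_evals + 1):
            # 2. Creating a child: parent plus normally distributed perturbation
            x_child = x_parent + step_size * rng.standard_normal(n)
            f_child = f(x_child)
            # 3. Selection: keep the individual with the lower objective value
            success = f_child <= f_parent
            if success:
                x_parent, f_parent = x_child, f_child
            history.append(success)
            # 4. Step size adaptation: 1/5 success rule, checked every N mutations
            if evals % n == 0 and len(history) == 10 * n:
                s = sum(history)
                if s < 2 * n:
                    step_size *= factor
                elif s > 2 * n:
                    step_size /= factor
        return x_parent, f_parent

    # Example: minimize the sphere function in 5 dimensions
    best_x, best_f = one_plus_one_es(lambda x: float(np.sum(x ** 2)), np.ones(5))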

Usage

... todo

Source Code

ToDo: Link to SVN

References

1. Rechenberg, I.: Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Friedrich Frommann Verlag, 1973.