Documentation/Modules

Current version as of 24 April 2019, 22:04

Modules contain all the functionality for optimizing and learning. One module may contain an optimization algorithm, an artificial neural network, or a problem to optimize.

Here is a list of the documented modules in OpenDino. Further modules may exist but are not yet documented.

Single Objective Optimization Algorithms

Indirect, Deterministic Algorithms

Indirect algorithms use gradient or higher order derivative information in the optimization.

Not implemented, yet.
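Although no indirect module ships yet, the idea is simple: follow derivative information downhill. A minimal gradient-descent sketch (illustrative Python only, not an OpenDino module):

```python
# Illustrative only: OpenDino has no indirect module yet.
# Minimal gradient descent on the quadratic bowl f(x) = sum(x_i^2).

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Take a fixed number of steps along the negative gradient."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# The gradient of f(x) = sum(x_i^2) is 2*x, so the minimizer is the origin.
x_min = gradient_descent(lambda x: [2.0 * xi for xi in x], [3.0, -4.0])
```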

Direct, Deterministic Algorithms

These algorithms use neither gradient information nor stochastic processes.

  • OptAlgSimplex: Nelder-Mead Simplex Algorithm
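For a feel of what a direct, deterministic search does, here is the Nelder-Mead simplex method applied to the 2-D Rosenbrock function. This uses SciPy's implementation, not OpenDino's OptAlgSimplex module (which is Java and whose API is not shown here):

```python
# SciPy's Nelder-Mead (not OpenDino's OptAlgSimplex) on 2-D Rosenbrock.
# The method only compares function values; no gradients, no randomness.
from scipy.optimize import minimize, rosen

res = minimize(rosen, x0=[-1.2, 1.0], method="Nelder-Mead")
# The Rosenbrock minimum is at (1, 1); res.x should end up close to it.
```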

Indirect, Stochastic Algorithms

These algorithms do not use gradient information but require stochastic processes (i.e. random numbers) in their search.

  • Evolutionary Algorithms
    • OptAlgOpO: 1+1 Evolution Strategy with 1/5 Success Rule: the (1+1)-ES
    • OptAlgCMA: A Multi-member Evolution Strategy with Covariance Matrix Adaptation: the CMA-ES
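The (1+1)-ES behind OptAlgOpO fits in a few lines. A self-contained Python sketch of the classic algorithm with the 1/5 success rule (a generic textbook version, not OpenDino's Java code):

```python
# Textbook (1+1)-ES with 1/5 success rule; not OpenDino's OptAlgOpO code.
import random

def one_plus_one_es(f, x0, sigma=1.0, iters=500, seed=1):
    """One parent, one Gaussian-mutated offspring per generation.

    Step size sigma is adapted by the 1/5 success rule: grow it after
    a successful mutation, shrink it after a failure, so that roughly
    one in five mutations succeeds.
    """
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:                  # offspring is at least as good
            x, fx = y, fy
            sigma *= 1.5              # success: enlarge step size
        else:
            sigma *= 1.5 ** -0.25     # failure: four failures undo one success
    return x, fx

# Sphere function as a toy objective.
best, fbest = one_plus_one_es(lambda x: sum(xi * xi for xi in x), [5.0, 5.0])
```

The asymmetric update factors keep sigma stationary exactly when one fifth of the mutations succeed, which is near-optimal on simple unimodal landscapes.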

Single and Multi-Objective Optimization Algorithms

Indirect, Stochastic Algorithms

  • Evolutionary Algorithms
    • OptAlgMoCMA: Elitist Evolution Strategy with Covariance Matrix Adaptation
  • Particle Methods

Design of Experiments

Optimization in General

Optimization Problems

  • ContinuousTestProblems: A Set of Single-Objective Test Problems
  • ContinuousMOTestProblems: A Set of Multi-Objective Test Problems
  • ProblemTruss: The goal is to optimize the thickness of 10 trusses. The weight of the truss, maximum stress, and displacement can each be set either as objective or constraint.
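One common way to let each quantity act as either objective or constraint is a penalty formulation. A hypothetical sketch (the function names and stand-in formulas below are invented for illustration and are not the ProblemTruss API):

```python
# Hypothetical sketch, NOT the ProblemTruss API: minimize weight while
# penalizing violated stress/displacement limits. Leaving a limit as
# None effectively removes that constraint.

def make_objective(weight, stress, displacement,
                   stress_limit=None, disp_limit=None, penalty=1e6):
    """Build a single scalar objective from objective + penalized constraints."""
    def f(x):
        value = weight(x)
        if stress_limit is not None:
            value += penalty * max(0.0, stress(x) - stress_limit)
        if disp_limit is not None:
            value += penalty * max(0.0, displacement(x) - disp_limit)
        return value
    return f

# Toy stand-ins: thicker members weigh more but stress and deflect less.
weight = lambda x: sum(x)
stress = lambda x: 10.0 / min(x)
displacement = lambda x: 5.0 / sum(x)

f = make_objective(weight, stress, displacement, stress_limit=4.0, disp_limit=1.0)
```

A feasible design then scores its plain weight, while an infeasible one is pushed far up the scale, so any of the single-objective optimizers above can be applied unchanged.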

Machine Learning

Miscellaneous Modules