Documentation/Optimization

Optimization

Introduction

Optimization searches for the best solution x* to a problem and can be defined as

  x* = argmin_x f(x)

where f is the function to be optimized (the objective function) and x are the design variables for which optimal settings have to be found.

This definition can be extended to multiple objectives f and to constraints.
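
As a minimal sketch of this definition (plain Python, not the OpenOpal API; the objective function and the candidate grid are chosen only for illustration), the snippet below encodes an objective f of two design variables and approximates x* by evaluating a set of candidate solutions and keeping the best one:

   import itertools

   def f(x):
       # Example objective function: minimum at x = (1.0, -2.0)
       return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

   # Candidate solutions x on a coarse grid (illustration only)
   grid = [i / 10.0 for i in range(-50, 51)]
   candidates = itertools.product(grid, grid)

   # x* = argmin_x f(x) over the candidate set
   x_star = min(candidates, key=f)
   print(x_star, f(x_star))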

Some Important Expressions

Manual vs. Automated Optimization

Optimization is an everyday task. For example, searching for the fastest way to work or back home is an optimization problem. We speak of automated optimization if a computer algorithm solves the problem in an automated fashion, i.e. without user interaction.

Objective Function

Optimization typically searches for the optimal solution to a problem. For automated optimization, the problem must be encoded in a mathematical function f, which has to be either minimized or maximized. In OpenOpal, only minimization is considered, since maximization can be expressed as:

   max(f) = - min(-f)
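
A minimal sketch of this convention (illustrative Python, not the OpenOpal API): a problem that should be maximized is handed to a minimizer by negating its objective, and the maximal value is recovered as the negative of the minimal value of -f:

   def f(x):
       # Objective to be maximized (illustrative): peak value 5.0 at x = 2.0
       return -(x - 2.0) ** 2 + 5.0

   def negated_f(x):
       # Minimizing -f(x) is equivalent to maximizing f(x)
       return -f(x)

   candidates = [i / 100.0 for i in range(-500, 501)]
   x_star = min(candidates, key=negated_f)            # argmax of f
   f_max = -min(negated_f(x) for x in candidates)     # max(f) = -min(-f)
   print(x_star, f_max)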

Design Variables

The optimal solution is searched by modifying the design variables x. The objective function depends on the design variables: f = f(x).

Multiple Objectives

If multiple objectives f should be optimized, a multi-objective optimization problem has to be solved. Some optimization algorithms search concurrently for multiple compromise solutions between the objectives (Pareto optimization); others search for a single compromise solution, defined as a weighted sum of all objectives.
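
The weighted-sum approach can be sketched as follows (illustrative Python; the two objectives and the weights are made up for demonstration):

   def f1(x):
       # First objective (illustrative), minimal at x = 1.0
       return (x - 1.0) ** 2

   def f2(x):
       # Second, conflicting objective (illustrative), minimal at x = -1.0
       return (x + 1.0) ** 2

   def weighted_sum(x, w1=0.7, w2=0.3):
       # Scalarize both objectives into one compromise objective
       return w1 * f1(x) + w2 * f2(x)

   candidates = [i / 100.0 for i in range(-300, 301)]
   x_compromise = min(candidates, key=weighted_sum)
   print(x_compromise, f1(x_compromise), f2(x_compromise))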

Constraints

While the optimization algorithm tries to minimize all objectives, constraints simply have to be fulfilled.

For example, the maximal stress in a truss should not exceed a certain limit. If the stress is below the limit, no advantage is gained. If the stress is above the limit, the solution is typically penalized with a penalty value that increases with increasing constraint violation.
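
A sketch of such a penalty approach (illustrative Python; the stress model, the stress limit, and the penalty factor are made-up numbers):

   STRESS_LIMIT = 250.0     # allowed maximum stress (made-up value)
   PENALTY_FACTOR = 1000.0  # weight of the penalty term (made-up value)

   def mass(x):
       # Objective to minimize: mass of the truss (illustrative model)
       return 10.0 * x

   def stress(x):
       # Constrained quantity: stress decreases with larger cross-section x
       return 500.0 / x

   def penalized_objective(x):
       violation = max(0.0, stress(x) - STRESS_LIMIT)
       # No penalty while the constraint is fulfilled; otherwise the
       # penalty grows with increasing constraint violation
       return mass(x) + PENALTY_FACTOR * violation

   candidates = [i / 100.0 for i in range(100, 1001)]  # x in [1.0, 10.0]
   x_star = min(candidates, key=penalized_objective)
   print(x_star, stress(x_star), mass(x_star))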

Direct vs. Indirect Search

In OpenOpal, we implement optimization algorithms that search a problem in an iterative fashion, i.e. by computing several different solutions to the problem. The best solution is returned. This iterative search can be either direct or indirect:

Direct search uses only direct information, i.e. the objective and constraint value(s). Indirect algorithms use indirect information, i.e. gradient and/or higher-order derivative information of the objective(s) and constraint(s).
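
The difference can be sketched with two update steps on the same one-dimensional objective (illustrative Python; the objective, step size, and learning rate are chosen arbitrarily): the direct step only compares objective values, while the indirect step uses the analytical gradient:

   def f(x):
       # Illustrative objective with its minimum at x = 3.0
       return (x - 3.0) ** 2

   def direct_step(x, step=0.5):
       # Direct search: compare objective values of neighboring points only
       trials = [x - step, x, x + step]
       return min(trials, key=f)

   def gradient_step(x, rate=0.1):
       # Indirect search: use the gradient f'(x) = 2 * (x - 3)
       return x - rate * 2.0 * (x - 3.0)

   x_direct = x_indirect = 0.0
   for _ in range(20):
       x_direct = direct_step(x_direct)
       x_indirect = gradient_step(x_indirect)
   print(x_direct, x_indirect)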

Stochastic vs. Deterministic Algorithms

While stochastic algorithms such as Evolutionary Algorithms and Particle Swarm Optimization use random values in their search method, deterministic algorithms like the Simplex Method or gradient-based search do not.
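
A minimal sketch of this distinction (illustrative Python; for brevity, gradient descent stands in for the deterministic case instead of the Simplex Method): the stochastic search draws random mutations and accepts improvements, while the deterministic search produces the same sequence of points on every run:

   import random

   def f(x):
       # Illustrative objective with its minimum at x = 3.0
       return (x - 3.0) ** 2

   def stochastic_search(x, iterations=100, sigma=0.5):
       # Stochastic: accept a random mutation whenever it improves f
       for _ in range(iterations):
           candidate = x + random.gauss(0.0, sigma)
           if f(candidate) < f(x):
               x = candidate
       return x

   def deterministic_search(x, iterations=100, rate=0.1):
       # Deterministic: plain gradient descent, no random values involved
       for _ in range(iterations):
           x = x - rate * 2.0 * (x - 3.0)
       return x

   print(stochastic_search(0.0), deterministic_search(0.0))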