==Problem solution==
The problem is normally solved using appropriate techniques from the field of optimization, including [[gradient]]-based algorithms and population-based algorithms, among others. Very simple problems can sometimes be expressed linearly; in that case the techniques of [[linear programming]] apply. Illustrative sketches of several of these method families follow the lists below.

===Gradient-based methods===
*[[Adjoint equation]]
*[[Newton's method]]
*[[Steepest descent]]
*[[Conjugate gradient]]
*[[Sequential quadratic programming]]

===Gradient-free methods===
*Hooke–Jeeves pattern search
*[[Nelder-Mead method]]

===Population-based methods===
*[[Genetic algorithm]]
*[[Memetic algorithm]]
*[[Particle swarm optimization]]
*[[Harmony search]]
*[[ODMA]]

===Other methods===
*Random search
*[[Grid search]]
*[[Simulated annealing]]
*[[Brute-force search|Direct search]]
*[[IOSO]] (Indirect Optimization based on Self-Organization)

Most of these techniques require large numbers of evaluations of the objectives and the constraints. The disciplinary models are often complex, so a single evaluation can take a significant amount of time and the overall solution process can be extremely time-consuming. Many of the optimization techniques are adaptable to [[parallel computing]], and much current research focuses on reducing the required computation time.

No existing solution method is guaranteed to find the [[global optimization|global optimum]] of a general problem (see [[No free lunch in search and optimization]]). Gradient-based methods find local optima with high reliability but are normally unable to escape a local optimum. Stochastic methods, such as simulated annealing and genetic algorithms, find a good solution with high probability, but very little can be said about the mathematical properties of that solution: it is not guaranteed even to be a local optimum, and these methods often find a different design each time they are run.
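As an illustration of the gradient-based family, the following is a minimal steepest-descent sketch. The quadratic objective, analytic gradient, fixed step size, and tolerance are all illustrative assumptions, standing in for an expensive multidisciplinary model:

<syntaxhighlight lang="python">
import numpy as np

def objective(x):
    # Cheap quadratic stand-in for an expensive disciplinary model.
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

def gradient(x):
    # Analytic gradient of the objective above.
    return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])

def steepest_descent(x0, step=0.01, tol=1e-8, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = gradient(x)
        if np.linalg.norm(g) < tol:  # stationary point reached
            break
        x = x - step * g             # move against the gradient
    return x

print(steepest_descent([0.0, 0.0]))  # converges towards (1, -2)
</syntaxhighlight>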
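For the gradient-free family, the Nelder-Mead simplex method is exposed through SciPy's generic <code>minimize</code> interface. This sketch assumes SciPy is available and uses the Rosenbrock function as a stand-in objective:

<syntaxhighlight lang="python">
from scipy.optimize import minimize

def objective(x):
    # Rosenbrock function: a standard non-convex test problem.
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

# Nelder-Mead needs no gradient information, only function values.
result = minimize(objective, [-1.2, 1.0], method="Nelder-Mead")
print(result.x)  # close to the minimum at (1, 1)
</syntaxhighlight>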
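A minimal particle swarm optimization sketch follows, as an example of the population-based family. The swarm size, inertia weight, and acceleration constants are conventional illustrative values, not prescribed by the article:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    return np.sum(x ** 2, axis=-1)  # sphere function, minimum at 0

def pso(dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    best_pos = pos.copy()                     # per-particle best
    best_val = objective(pos)
    g = best_pos[np.argmin(best_val)].copy()  # swarm-wide best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Blend inertia, attraction to personal best, and to swarm best.
        vel = w * vel + c1 * r1 * (best_pos - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        val = objective(pos)
        improved = val < best_val
        best_pos[improved], best_val[improved] = pos[improved], val[improved]
        g = best_pos[np.argmin(best_val)].copy()
    return g

print(pso())  # near the origin
</syntaxhighlight>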
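A simulated annealing sketch shows how a stochastic method can escape local optima by occasionally accepting worse designs; the cooling schedule and proposal scale below are illustrative assumptions:

<syntaxhighlight lang="python">
import math
import random

random.seed(0)

def objective(x):
    # One-dimensional multimodal function with many local minima.
    return x ** 2 + 10 * math.sin(3 * x)

def anneal(x0=4.0, t0=5.0, cooling=0.995, steps=5000):
    x, fx = x0, objective(x0)
    t = t0
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.5)  # random neighbour
        fc = objective(cand)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature falls.
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
        t *= cooling
    return x, fx

print(anneal())
</syntaxhighlight>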
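Finally, because the many expensive evaluations required by population-based and direct-search methods are independent of one another, they parallelize naturally. This sketch uses Python's standard process pool; the sleep is a placeholder for a slow disciplinary analysis:

<syntaxhighlight lang="python">
import time
from concurrent.futures import ProcessPoolExecutor

def expensive_objective(x):
    time.sleep(0.1)            # placeholder for a slow simulation
    return (x - 3.0) ** 2

if __name__ == "__main__":
    candidates = [i * 0.5 for i in range(20)]
    with ProcessPoolExecutor() as pool:
        # Evaluate all candidate designs concurrently.
        values = list(pool.map(expensive_objective, candidates))
    best = min(zip(values, candidates))
    print(best)                # smallest objective and its design
</syntaxhighlight>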