== Related techniques ==
{{See also|List of genetic algorithm applications}}

===Parent fields===
Genetic algorithms are a sub-field of:
*[[Evolutionary algorithms]]
*[[Evolutionary computing]]
*[[Metaheuristic]]s
*[[Stochastic optimization]]
*[[Optimization (mathematics)|Optimization]]

===Related fields===

====Evolutionary algorithms====
{{More citations needed section|date=May 2011}}
{{main|Evolutionary algorithm}}
Evolutionary algorithms are a sub-field of [[Evolutionary Computation|evolutionary computing]].
* [[Evolution strategy|Evolution strategies]] (ES, see Rechenberg, 1994) evolve individuals by means of mutation and intermediate or discrete recombination. ES algorithms are designed particularly to solve problems in the real-value domain.<ref>{{cite book|last=Cohoon|first=J|display-authors=etal|title=Evolutionary algorithms for the physical design of VLSI circuits|url= https://www.ifte.de/mitarbeiter/lienig/cohoon.pdf |archive-url=https://ghostarchive.org/archive/20221009/https://www.ifte.de/mitarbeiter/lienig/cohoon.pdf |archive-date=2022-10-09 |url-status=live|journal=Advances in Evolutionary Computing: Theory and Applications|publisher= Springer, pp. 683-712, 2003|isbn=978-3-540-43330-9|year=2002}}</ref> They use self-adaptation to adjust control parameters of the search. De-randomization of self-adaptation has led to the contemporary Covariance Matrix Adaptation Evolution Strategy ([[CMA-ES]]).
* [[Evolutionary programming]] (EP) involves populations of solutions with primarily mutation and selection and arbitrary representations. It uses self-adaptation to adjust parameters, and can include other variation operations such as combining information from multiple parents.
* [[Estimation of Distribution Algorithm]] (EDA) substitutes traditional reproduction operators with model-guided operators. Such models are learned from the population by employing machine learning techniques and are represented as probabilistic graphical models, from which new solutions can be sampled<ref>{{cite book|last1=Pelikan|first1=Martin|last2=Goldberg|first2=David E.|last3=Cantú-Paz|first3=Erick|title=BOA: The Bayesian Optimization Algorithm|journal=Proceedings of the 1st Annual Conference on Genetic and Evolutionary Computation - Volume 1|date=1 January 1999|pages=525–532|url=http://dl.acm.org/citation.cfm?id=2933973|isbn=9781558606111|series=Gecco'99}}</ref><ref>{{cite book|last1=Pelikan|first1=Martin|title=Hierarchical Bayesian optimization algorithm : toward a new generation of evolutionary algorithms|date=2005|publisher=Springer|location=Berlin [u.a.]|isbn=978-3-540-23774-7|edition=1st}}</ref> or generated from guided crossover.<ref>{{cite book|last1=Thierens|first1=Dirk|title=Parallel Problem Solving from Nature, PPSN XI |chapter=The Linkage Tree Genetic Algorithm|date=11 September 2010|pages=264–273|doi=10.1007/978-3-642-15844-5_27|language=en|isbn=978-3-642-15843-8}}</ref> A minimal sampling sketch in this spirit appears after this list.
* [[Genetic programming]] (GP) is a related technique popularized by [[John Koza]] in which computer programs, rather than function parameters, are optimized. Genetic programming often uses [[Tree (data structure)|tree-based]] internal [[data structure]]s to represent the computer programs for adaptation instead of the [[List (computing)|list]] structures typical of genetic algorithms. There are many variants of genetic programming, including [[Cartesian genetic programming]], [[Gene expression programming]],<ref>{{cite journal|last=Ferreira|first=C|title=Gene Expression Programming: A New Adaptive Algorithm for Solving Problems|url= http://www.gene-expression-programming.com/webpapers/GEP.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://www.gene-expression-programming.com/webpapers/GEP.pdf |archive-date=2022-10-09 |url-status=live|journal=Complex Systems |year=2001|volume=13 |issue=2 |pages=87–129|arxiv=cs/0102027|bibcode=2001cs........2027F}}</ref> [[grammatical evolution]], [[Linear genetic programming]] and [[Multi expression programming]].
* [[Grouping genetic algorithm]] (GGA) is an evolution of the GA where the focus is shifted from individual items, as in classical GAs, to groups or subsets of items.<ref name="Falkenauer">{{cite book|last=Falkenauer|first=Emanuel|author-link=Emanuel Falkenauer|year=1997|title=Genetic Algorithms and Grouping Problems|publisher=John Wiley & Sons Ltd|location=Chichester, England|isbn=978-0-471-97150-4}}</ref> The idea behind this GA evolution, proposed by [[Emanuel Falkenauer]], is that solving some complex problems, a.k.a. ''clustering'' or ''partitioning'' problems, where a set of items must be split into disjoint groups of items in an optimal way, is better achieved by making characteristics of the groups of items equivalent to genes. These kinds of problems include [[bin packing problem|bin packing]], line balancing, [[cluster analysis|clustering]] with respect to a distance measure, equal piles, etc., on which classic GAs proved to perform poorly. Making genes equivalent to groups implies chromosomes that are in general of variable length, and special genetic operators that manipulate whole groups of items. For bin packing in particular, a GGA hybridized with the Dominance Criterion of Martello and Toth is arguably the best technique to date.
* [[Interactive evolutionary algorithm]]s are evolutionary algorithms that use human evaluation. They are usually applied to domains where it is hard to design a computational fitness function, for example, evolving images, music, artistic designs and forms to fit users' aesthetic preference.
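As a rough illustration of the model-building idea behind EDAs, the following sketch uses a univariate marginal model (UMDA-style) over bit strings: per-bit probabilities are re-estimated from the selected individuals and the next population is sampled from that model. This is a minimal sketch, not the BOA or linkage-tree algorithms cited above; the function names and parameter values are illustrative assumptions.

<syntaxhighlight lang="python">
import random

def umda(fitness, n_bits=20, pop_size=100, n_select=50, generations=50):
    """Univariate marginal EDA sketch: fit per-bit probabilities to the
    selected individuals and sample the next population from that model."""
    probs = [0.5] * n_bits  # initial model: each bit is 0 or 1 with equal probability
    pop = []
    for _ in range(generations):
        # sample a new population from the current probabilistic model
        pop = [[1 if random.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        # truncation selection: keep the best individuals
        pop.sort(key=fitness, reverse=True)
        selected = pop[:n_select]
        # re-estimate the per-bit marginal probabilities from the selected set
        probs = [sum(ind[i] for ind in selected) / n_select
                 for i in range(n_bits)]
    return max(pop, key=fitness)

# usage: maximise the number of ones (the OneMax toy problem)
best = umda(fitness=sum)
print(best)
</syntaxhighlight>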
====Swarm intelligence====
{{main|Swarm intelligence}}
Swarm intelligence is a sub-field of [[Evolutionary Computation|evolutionary computing]].
* [[Ant colony optimization]] ('''ACO''') uses many ants (or agents) equipped with a pheromone model to traverse the solution space and find locally productive areas.
* Although considered an [[Estimation of distribution algorithm]],<ref>{{cite journal|last1=Zlochin|first1=Mark|last2=Birattari|first2=Mauro|last3=Meuleau|first3=Nicolas|last4=Dorigo|first4=Marco|title=Model-Based Search for Combinatorial Optimization: A Critical Survey|journal=Annals of Operations Research|date=1 October 2004|volume=131|issue=1–4|pages=373–395|doi=10.1023/B:ANOR.0000039526.52305.af|language=en|issn=0254-5330|citeseerx=10.1.1.3.427|s2cid=63137}}</ref> [[Particle swarm optimization]] (PSO) is a computational method for multi-parameter optimization which also uses a population-based approach. A population (swarm) of candidate solutions (particles) moves in the search space, and the movement of the particles is influenced both by their own best known position and by the swarm's global best known position. Like genetic algorithms, the PSO method depends on information sharing among population members. In some problems PSO is often more computationally efficient than GAs, especially in unconstrained problems with continuous variables.<ref>Rania Hassan, Babak Cohanim, Olivier de Weck, Gerhard Venter (2005) [https://www.mit.edu/~deweck/PDF_archive/3%20Refereed%20Conference/3_50_AIAA-2005-1897.pdf A comparison of particle swarm optimization and the genetic algorithm]</ref> A minimal sketch of the PSO update rule appears after this list.
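The following sketch illustrates the PSO update described above: each particle's velocity is pulled toward its own best known position and toward the swarm's global best. The coefficient values and function names are assumptions for illustration, not taken from the cited sources.

<syntaxhighlight lang="python">
import random

def pso(f, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO sketch: particles are attracted to their personal best
    position and to the swarm's global best position."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # personal best positions
    gbest = min(pbest, key=f)            # swarm's global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):  # update personal best (minimisation)
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=f)        # update global best
    return gbest

# usage: minimise the sphere function
print(pso(lambda x: sum(v * v for v in x)))
</syntaxhighlight>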
====Other evolutionary computing algorithms====
Evolutionary computation is a sub-field of the [[metaheuristic]] methods.
* [[Memetic algorithm]] (MA), often called ''hybrid genetic algorithm'' among others, is a population-based method in which solutions are also subject to local improvement phases (a minimal sketch of such a phase appears after this list). The idea of memetic algorithms comes from [[meme]]s, which, unlike genes, can adapt themselves. In some problem areas they have been shown to be more efficient than traditional evolutionary algorithms.
* [[Bacteriologic algorithm]]s (BA) are inspired by [[evolutionary ecology]] and, more particularly, bacteriologic adaptation. Evolutionary ecology is the study of living organisms in the context of their environment, with the aim of discovering how they adapt. Its basic concept is that in a heterogeneous environment, no single individual fits the whole environment, so one needs to reason at the population level. It is also believed BAs could be successfully applied to complex positioning problems (antennas for cell phones, urban planning, and so on) or data mining.<ref>{{cite journal|url=http://www.irisa.fr/triskell/publis/2005/Baudry05d.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://www.irisa.fr/triskell/publis/2005/Baudry05d.pdf |archive-date=2022-10-09 |url-status=live|first=Benoit|last=Baudry |author2=Franck Fleurey |author3-link=Jean-Marc Jézéquel|author3=Jean-Marc Jézéquel |author4=Yves Le Traon|title=Automatic Test Case Optimization: A Bacteriologic Algorithm|date=March–April 2005|pages=76–82|journal=IEEE Software|issue=2|doi=10.1109/MS.2005.30|volume=22|s2cid=3559602|access-date=9 August 2009}}</ref>
* [[Cultural algorithm]] (CA) consists of a population component almost identical to that of the genetic algorithm and, in addition, a knowledge component called the belief space.
* [[Differential evolution]] (DE) is inspired by the migration of superorganisms.<ref>{{cite journal|last=Civicioglu|first=P.|title=Transforming Geocentric Cartesian Coordinates to Geodetic Coordinates by Using Differential Search Algorithm|journal=Computers & Geosciences|year=2012|volume=46|pages=229–247|doi=10.1016/j.cageo.2011.12.011|bibcode=2012CG.....46..229C}}</ref>
* [[Gaussian adaptation]] (normal or natural adaptation, abbreviated NA to avoid confusion with GA) is intended for the maximisation of manufacturing yield of signal processing systems. It may also be used for ordinary parametric optimisation. It relies on a certain theorem valid for all regions of acceptability and all Gaussian distributions. The efficiency of NA relies on information theory and a certain theorem of efficiency. Its efficiency is defined as information divided by the work needed to get the information.<ref>{{cite journal|last=Kjellström|first=G.|title= On the Efficiency of Gaussian Adaptation|journal=Journal of Optimization Theory and Applications|volume=71|issue=3|pages=589–597|date=December 1991|doi= 10.1007/BF00941405|s2cid=116847975}}</ref> Because NA maximises mean fitness rather than the fitness of the individual, the landscape is smoothed such that valleys between peaks may disappear. Therefore, it has a certain "ambition" to avoid local peaks in the fitness landscape. NA is also good at climbing sharp crests by adaptation of the moment matrix, because NA may maximise the disorder ([[average information]]) of the Gaussian while simultaneously keeping the [[mean fitness]] constant.
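To illustrate the local improvement phase that distinguishes memetic algorithms from a plain GA, here is a minimal sketch that applies a hill-climbing step to every offspring of a simple bit-string evolutionary loop. It omits crossover for brevity; the helper names and parameter values are illustrative assumptions only.

<syntaxhighlight lang="python">
import random

def local_improvement(ind, fitness, tries=10):
    """Hill-climbing phase: accept single-bit flips that improve fitness."""
    for _ in range(tries):
        i = random.randrange(len(ind))
        neighbour = ind[:]
        neighbour[i] = 1 - neighbour[i]   # flip one bit
        if fitness(neighbour) > fitness(ind):
            ind = neighbour
    return ind

def memetic_step(population, fitness, mutation_rate=0.05):
    """One generation: tournament selection, mutation, then local search
    on every offspring (the 'memetic' part)."""
    offspring = []
    for _ in range(len(population)):
        a, b = random.sample(population, 2)
        child = (a if fitness(a) >= fitness(b) else b)[:]   # tournament of two
        child = [1 - g if random.random() < mutation_rate else g for g in child]
        offspring.append(local_improvement(child, fitness))
    return offspring

# usage: evolve 30 bit-strings of length 20 toward all ones (OneMax)
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for _ in range(50):
    pop = memetic_step(pop, fitness=sum)
print(max(pop, key=sum))
</syntaxhighlight>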
====Other metaheuristic methods====
Metaheuristic methods broadly fall within [[Stochastic optimization|stochastic]] optimisation methods.
* [[Simulated annealing]] (SA) is a related global optimization technique that traverses the search space by testing random mutations on an individual solution. A mutation that increases fitness is always accepted. A mutation that lowers fitness is accepted probabilistically based on the difference in fitness and a decreasing temperature parameter. In SA parlance, one speaks of seeking the lowest energy instead of the maximum fitness. SA can also be used within a standard GA by starting with a relatively high rate of mutation and decreasing it over time along a given schedule.
* [[Tabu search]] (TS) is similar to simulated annealing in that both traverse the solution space by testing mutations of an individual solution. While simulated annealing generates only one mutated solution, tabu search generates many mutated solutions and moves to the solution with the lowest energy of those generated. In order to prevent cycling and encourage greater movement through the solution space, a tabu list of partial or complete solutions is maintained. It is forbidden to move to a solution that contains elements of the tabu list, which is updated as the solution traverses the solution space.
* [[Extremal optimization]] (EO), unlike GAs, which work with a population of candidate solutions, evolves a single solution and makes [[local search (optimization)|local]] modifications to the worst components. This requires that a suitable representation be selected which permits individual solution components to be assigned a quality measure ("fitness"). The governing principle behind this algorithm is that of ''emergent'' improvement through selectively removing low-quality components and replacing them with a randomly selected component. This is decidedly at odds with a GA that selects good solutions in an attempt to make better solutions.

====Other stochastic optimisation methods====
* The [[Cross-entropy method|cross-entropy (CE) method]] generates candidate solutions via a parameterized probability distribution. The parameters are updated via cross-entropy minimization, so as to generate better samples in the next iteration. A minimal sketch appears after this list.
* Reactive search optimization (RSO) advocates the integration of sub-symbolic machine learning techniques into search heuristics for solving complex optimization problems. The word ''reactive'' hints at a ready response to events during the search through an internal online feedback loop for the self-tuning of critical parameters. Methodologies of interest for reactive search include machine learning and statistics, in particular [[reinforcement learning]], [[Active learning (machine learning)|active or query learning]], [[Artificial neural network|neural networks]], and [[metaheuristics]].
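The following is a minimal sketch of the cross-entropy idea for continuous minimisation: candidates are sampled from an independent Gaussian model, and the model's mean and standard deviation are refitted to the elite samples at each iteration. The parameter choices are illustrative assumptions, not values prescribed by the method's literature.

<syntaxhighlight lang="python">
import random
import statistics

def cross_entropy_method(f, dim=2, pop_size=100, elite=10, iters=50):
    """CE sketch: sample candidates from a Gaussian model, keep the elite
    samples, and refit the model's mean and standard deviation to them."""
    mu = [0.0] * dim
    sigma = [5.0] * dim
    for _ in range(iters):
        samples = [[random.gauss(mu[d], sigma[d]) for d in range(dim)]
                   for _ in range(pop_size)]
        samples.sort(key=f)              # minimisation: lowest objective first
        elites = samples[:elite]
        # update the distribution parameters from the elite set
        mu = [statistics.mean(s[d] for s in elites) for d in range(dim)]
        sigma = [statistics.stdev(s[d] for s in elites) + 1e-6
                 for d in range(dim)]    # small floor avoids premature collapse
    return mu

# usage: minimise the sphere function
print(cross_entropy_method(lambda x: sum(v * v for v in x)))
</syntaxhighlight>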