== Theoretical background ==
The following theoretical principles apply to all or almost all EAs.

=== No free lunch theorem ===
The [[no free lunch theorem]] of optimization states that all optimization strategies are equally effective when the set of all optimization problems is considered. Under the same condition, no evolutionary algorithm is fundamentally better than another. One EA can only be better than another if the set of problems considered is restricted, and this is exactly what is inevitably done in practice. Therefore, to improve an EA, it must exploit problem knowledge in some form (e.g. by choosing a certain mutation strength or a [[Genetic representation|problem-adapted coding]]). Thus, whenever two EAs are compared, this restriction is implied. In addition, an EA can use problem-specific knowledge by, for example, not generating the entire start population randomly, but creating some individuals through [[Heuristic (computer science)|heuristics]] or other procedures.<ref name="Davis91">{{Cite book |last=Davis |first=Lawrence |url=https://www.worldcat.org/oclc/23081440 |title=Handbook of genetic algorithms |date=1991 |publisher=Van Nostrand Reinhold |isbn=0-442-00173-8 |location=New York |oclc=23081440}}</ref><ref>{{Citation |last1=Lienig |first1=Jens |title=An evolutionary algorithm for the routing of multi-chip modules |date=1994 |url=http://link.springer.com/10.1007/3-540-58484-6_301 |work=Parallel Problem Solving from Nature — PPSN III |volume=866 |pages=588–597 |editor-last=Davidor |editor-first=Yuval |place=Berlin, Heidelberg |publisher=Springer |doi=10.1007/3-540-58484-6_301 |isbn=978-3-540-58484-1 |access-date=2022-10-18 |last2=Brandt |first2=Holger |editor2-last=Schwefel |editor2-first=Hans-Paul |editor3-last=Männer |editor3-first=Reinhard|url-access=subscription }}</ref> Another way to tailor an EA to a given problem domain is to involve suitable heuristics, [[Local search (optimization)|local search procedures]] or other problem-related procedures in the process of generating the offspring. This form of extension of an EA is also known as a [[memetic algorithm]]. Both extensions play a major role in practical applications, as they can speed up the search process and make it more robust.<ref name="Davis91" /><ref>{{Cite book |url=http://link.springer.com/10.1007/978-3-642-23247-3 |title=Handbook of Memetic Algorithms |date=2012 |publisher=Springer Berlin Heidelberg |isbn=978-3-642-23246-6 |editor-last=Neri |editor-first=Ferrante |series=Studies in Computational Intelligence |volume=379 |location=Berlin, Heidelberg |language=en |doi=10.1007/978-3-642-23247-3 |editor-last2=Cotta |editor-first2=Carlos |editor-last3=Moscato |editor-first3=Pablo}}</ref>
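As an illustration of such heuristic seeding, a minimal sketch (not taken from the cited sources; the heuristic constructor is a hypothetical placeholder whose details depend entirely on the problem) could fill a small share of the start population with problem-specific solutions and generate the remainder randomly:

<syntaxhighlight lang="python">
import random

def heuristic_individual(length=10):
    # Hypothetical stand-in for a problem-specific heuristic,
    # e.g. a greedy or nearest-neighbour construction.
    return [0.0] * length

def random_individual(length=10):
    return [random.uniform(-5.0, 5.0) for _ in range(length)]

def init_population(size, heuristic_fraction=0.1):
    # Seed a small share of the start population with heuristic
    # solutions and generate the remainder randomly.
    n_heuristic = int(size * heuristic_fraction)
    population = [heuristic_individual() for _ in range(n_heuristic)]
    population += [random_individual() for _ in range(size - n_heuristic)]
    return population
</syntaxhighlight>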
=== Convergence ===
For EAs in which, in addition to the offspring, at least the best individual of the parent generation is used to form the subsequent generation (so-called elitist EAs), there is a general proof of [[Convergence (logic)|convergence]], provided that an [[optimum]] exists. [[Without loss of generality]], a maximum search is assumed for the proof: from the property of elitist offspring acceptance and the existence of the optimum it follows that, per generation <math>k</math>, an improvement of the fitness <math>F</math> of the respective best individual <math>x'</math> occurs with a probability <math>P > 0</math>. Thus:

:<math>F(x'_1) \leq F(x'_2) \leq F(x'_3) \leq \cdots \leq F(x'_k) \leq \cdots</math>

That is, the fitness values of the respective best individuals form a [[Monotonic function|monotonically]] non-decreasing [[sequence]], which is [[bounded set|bounded]] due to the existence of the optimum. It follows that the sequence converges to the optimum. Since the proof makes no statement about the speed of convergence, it is of little help in practical applications of EAs, but it does justify the recommendation to use elitist EAs. However, when using the usual [[Panmixia|panmictic]] [[Population model (evolutionary algorithm)|population model]], elitist EAs tend to [[Premature convergence|converge prematurely]] more than non-elitist ones.<ref>{{Cite journal |last1=Leung |first1=Yee |last2=Gao |first2=Yong |last3=Xu |first3=Zong-Ben |date=1997 |title=Degree of population diversity - a perspective on premature convergence in genetic algorithms and its Markov chain analysis |url=https://ieeexplore.ieee.org/document/623217 |journal=IEEE Transactions on Neural Networks |volume=8 |issue=5 |pages=1165–1176 |doi=10.1109/72.623217 |pmid=18255718 |issn=1045-9227|url-access=subscription }}</ref> In a panmictic population model, mate selection (see step 4 of the [[Evolutionary algorithm#Generic definition|generic definition]]) is such that every individual in the entire population is eligible as a mate. In [[Population model (evolutionary algorithm)|non-panmictic populations]], selection is suitably restricted, so that better individuals spread more slowly than in panmictic ones. The general risk of premature convergence of elitist EAs can therefore be reduced significantly by population models that suitably restrict mate selection.<ref>{{Citation |last=Gorges-Schleuter |first=Martina |title=A comparative study of global and local selection in evolution strategies |date=1998 |url=http://link.springer.com/10.1007/BFb0056879 |work=Parallel Problem Solving from Nature — PPSN V |series=Lecture Notes in Computer Science |volume=1498 |pages=367–377 |editor-last=Eiben |editor-first=Agoston E. |place=Berlin, Heidelberg |publisher=Springer Berlin Heidelberg |doi=10.1007/bfb0056879 |isbn=978-3-540-65078-2 |access-date=2022-10-21 |editor2-last=Bäck |editor2-first=Thomas |editor3-last=Schoenauer |editor3-first=Marc |editor4-last=Schwefel |editor4-first=Hans-Paul|url-access=subscription }}</ref><ref>{{Cite book |last1=Dorronsoro |first1=Bernabe |url=http://link.springer.com/10.1007/978-0-387-77610-1 |title=Cellular Genetic Algorithms |last2=Alba |first2=Enrique |date=2008 |publisher=Springer US |isbn=978-0-387-77609-5 |series=Operations Research/Computer Science Interfaces Series |volume=42 |location=Boston, MA |doi=10.1007/978-0-387-77610-1}}</ref>
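The elitist acceptance step on which this argument rests can be sketched as follows (an illustrative assumption, not taken from the cited literature; <code>fitness</code> is assumed to be a callable to be maximized):

<syntaxhighlight lang="python">
def next_generation(parents, offspring, fitness):
    # Elitist acceptance: the best parent always survives, so the best
    # fitness in the population can never decrease from one generation
    # to the next (maximization assumed).
    elite = max(parents, key=fitness)
    best_offspring = sorted(offspring, key=fitness, reverse=True)
    return [elite] + best_offspring[:len(parents) - 1]
</syntaxhighlight>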
=== Virtual alphabets ===
With the theory of virtual alphabets, [[David E. Goldberg]] showed in 1990 that, when a representation with real numbers is used, an EA that uses classical [[Crossover (genetic algorithm)|recombination operators]] (e.g. uniform or n-point crossover) cannot reach certain areas of the search space, in contrast to a coding with binary numbers.<ref>{{Citation |last=Goldberg |first=David E. |title=The theory of virtual alphabets |date=1990 |url=http://link.springer.com/10.1007/BFb0029726 |work=Parallel Problem Solving from Nature |series=Lecture Notes in Computer Science |volume=496 |pages=13–22 |publication-date=1991 |editor-last=Schwefel |editor-first=Hans-Paul |place=Berlin/Heidelberg |publisher=Springer-Verlag |language=en |doi=10.1007/bfb0029726 |isbn=978-3-540-54148-6 |access-date=2022-10-22 |editor2-last=Männer |editor2-first=Reinhard|url-access=subscription }}</ref> This leads to the recommendation that EAs with a real-valued representation should use arithmetic operators for recombination (e.g. arithmetic mean or intermediate recombination). With suitable operators, real-valued representations are more effective than binary ones, contrary to earlier opinion.<ref>{{Cite book |last1=Stender |first1=J. |url=https://www.worldcat.org/oclc/47216370 |title=Genetic algorithms in optimisation, simulation, and modelling |last2=Hillebrand |first2=E. |last3=Kingdon |first3=J. |date=1994 |publisher=IOS Press |isbn=90-5199-180-0 |location=Amsterdam |oclc=47216370}}</ref><ref>{{Cite book |last=Michalewicz |first=Zbigniew |url=https://www.worldcat.org/oclc/851375253 |title=Genetic Algorithms + Data Structures = Evolution Programs |date=1996 |publisher=Springer |isbn=978-3-662-03315-9 |edition=3rd |publication-place=Berlin Heidelberg |oclc=851375253}}</ref>
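A minimal sketch of one such arithmetic operator, intermediate recombination, for real-valued chromosomes (illustrative only; the widening factor <code>d</code> and the function name are assumptions, not taken from the cited sources):

<syntaxhighlight lang="python">
import random

def intermediate_recombination(parent1, parent2, d=0.25):
    # Each offspring gene is drawn uniformly from the interval spanned
    # by the two parent genes, widened by the fraction d on both sides,
    # so offspring can also lie slightly outside the parents' values.
    child = []
    for x, y in zip(parent1, parent2):
        lo, hi = min(x, y), max(x, y)
        span = hi - lo
        child.append(random.uniform(lo - d * span, hi + d * span))
    return child
</syntaxhighlight>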