Genetic algorithm
== Limitations ==
{{more citations needed section|date=March 2024}}
The practical use of a genetic algorithm has limitations, especially as compared to alternative optimization algorithms:
* Repeated [[fitness function]] evaluation for complex problems is often the most prohibitive and limiting segment of artificial evolutionary algorithms. Finding the optimal solution to complex high-dimensional, multimodal problems often requires very expensive [[fitness function]] evaluations. In real-world problems such as structural optimization problems, a single function evaluation may require several hours to several days of complete simulation. Typical optimization methods cannot deal with such types of problem. In this case, it may be necessary to forgo an exact evaluation and use an [[fitness approximation|approximated fitness]] that is computationally efficient. Amalgamation of [[fitness approximation|approximate models]] may be one of the most promising approaches to convincingly use GA to solve complex real-life problems.{{citation needed|date=March 2024}}
* Genetic algorithms do not scale well with complexity. That is, where the number of elements exposed to mutation is large, there is often an exponential increase in search space size. This makes it extremely difficult to use the technique on problems such as designing an engine, a house or a plane.{{Citation needed|date=December 2020}} In order to make such problems tractable to evolutionary search, they must be broken down into the simplest representation possible. Hence we typically see evolutionary algorithms encoding designs for fan blades instead of engines, building shapes instead of detailed construction plans, and airfoils instead of whole aircraft designs.
The second problem of complexity is the issue of how to protect parts that have evolved to represent good solutions from further destructive mutation, particularly when their fitness assessment requires them to combine well with other parts.{{citation needed|date=March 2024}}
* The "better" solution is only better in comparison to other solutions. As a result, the stopping criterion is not clear in every problem.{{citation needed|date=March 2024}}
* In many problems, GAs have a tendency to converge towards [[local optimum|local optima]] or even arbitrary points rather than the [[global optimum]] of the problem. This means that the algorithm does not "know how" to sacrifice short-term fitness to gain longer-term fitness. The likelihood of this occurring depends on the shape of the [[fitness landscape]]: certain problems may provide an easy ascent towards a global optimum, while others may make it easier for the function to find local optima. This problem may be alleviated by using a different fitness function, increasing the rate of mutation, or by using selection techniques that maintain a diverse population of solutions,<ref>{{cite journal|last1=Taherdangkoo|first1=Mohammad|last2=Paziresh|first2=Mahsa|last3=Yazdi|first3=Mehran|last4=Bagheri|first4=Mohammad Hadi|title=An efficient algorithm for function optimization: modified stem cells algorithm|journal=Central European Journal of Engineering|date=19 November 2012|volume=3|issue=1|pages=36–50|doi=10.2478/s13531-012-0047-8|doi-access=free}}</ref> although the [[No free lunch in search and optimization|No Free Lunch theorem]]<ref>Wolpert, D.H., Macready, W.G., 1995. No Free Lunch Theorems for Optimisation. Santa Fe Institute, SFI-TR-05-010, Santa Fe.</ref> proves that there is no general solution to this problem.
A common technique to maintain diversity is to impose a "niche penalty", wherein any group of individuals of sufficient similarity (niche radius) have a penalty added, which reduces the representation of that group in subsequent generations, permitting other (less similar) individuals to be maintained in the population. This trick, however, may not be effective, depending on the landscape of the problem. Another possible technique is simply to replace part of the population with randomly generated individuals when most of the population is too similar to each other. Diversity is important in genetic algorithms (and [[genetic programming]]) because crossing over a homogeneous population does not yield new solutions. In [[Evolution strategy|evolution strategies]] and [[evolutionary programming]], diversity is not essential because of a greater reliance on mutation.{{citation needed|date=March 2024}}
* Operating on dynamic data sets is difficult, as genomes begin to converge early on towards solutions which may no longer be valid for later data. Several methods have been proposed to remedy this by increasing genetic diversity and preventing early convergence, either by increasing the probability of mutation when the solution quality drops (called ''triggered hypermutation''), or by occasionally introducing entirely new, randomly generated elements into the gene pool (called ''random immigrants''). Again, [[Evolution strategy|evolution strategies]] and [[evolutionary programming]] can be implemented with a so-called "comma strategy", in which parents are not maintained and new parents are selected only from offspring. This can be more effective on dynamic problems.{{citation needed|date=March 2024}}
* GAs cannot effectively solve problems in which the only fitness measure is a binary pass/fail outcome (like [[decision problem]]s), as there is no way to converge on the solution (no hill to climb).
In these cases, a random search may find a solution as quickly as a GA. However, if the situation allows the success/failure trial to be repeated, giving (possibly) different results, then the ratio of successes to failures provides a suitable fitness measure.{{citation needed|date=March 2024}}
* For specific optimization problems and problem instances, other optimization algorithms may be more efficient than genetic algorithms in terms of speed of convergence. Alternative and complementary algorithms include [[Evolution strategy|evolution strategies]], [[evolutionary programming]], [[simulated annealing]], [[Gaussian adaptation]], [[hill climbing]], [[swarm intelligence]] (e.g. [[ant colony optimization]], [[particle swarm optimization]]), and methods based on [[integer linear programming]]. The suitability of genetic algorithms depends on the amount of knowledge of the problem; well-known problems often have better, more specialized approaches.{{citation needed|date=March 2024}}
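The fitness-approximation idea in the first bullet can be sketched in a few lines: spend the expensive evaluation only on the candidates a cheap surrogate ranks best, and let everyone else keep the surrogate's estimate. This is a minimal illustration, not code from any particular GA library; the function and parameter names are invented for the example.

```python
def screen_with_surrogate(candidates, cheap_estimate, expensive_fitness, k=3):
    """Evaluate the expensive fitness only on the k candidates the cheap
    surrogate ranks best; the rest keep their surrogate estimates.

    Both `cheap_estimate` and `expensive_fitness` map candidate -> score
    (higher is better). Names are illustrative, not a standard API.
    """
    # Rank all candidates by the cheap model (best first).
    ranked = sorted(candidates, key=cheap_estimate, reverse=True)
    # Pay for the expensive simulation only on the top k.
    exact = {id(c): expensive_fitness(c) for c in ranked[:k]}
    # Everyone else keeps the cheap estimate.
    return [exact.get(id(c), cheap_estimate(c)) for c in candidates]
```

In a real structural-optimization setting the surrogate would typically be a regression model refit periodically on the exact evaluations gathered so far.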
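The "niche penalty" described above is commonly implemented as fitness sharing: an individual's raw fitness is divided by a count of how crowded its niche is. The following sketch assumes real-valued genomes and a triangular sharing function; all names and parameter choices are illustrative.

```python
def shared_fitness(population, raw_fitness, niche_radius=1.0):
    """Fitness sharing: penalize individuals that crowd the same niche.

    `population` is a list of real-valued genomes (lists of floats);
    `raw_fitness` maps a genome to a score to maximize.
    Illustrative sketch only, not a fixed library interface.
    """
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def sharing(d):
        # Full penalty contribution at d = 0, tapering to none at the radius.
        return 1.0 - d / niche_radius if d < niche_radius else 0.0

    shared = []
    for ind in population:
        # niche_count >= 1 because each individual shares with itself.
        niche_count = sum(sharing(distance(ind, other)) for other in population)
        shared.append(raw_fitness(ind) / niche_count)
    return shared
```

Selection then operates on the shared values, so a cluster of near-identical individuals claims fewer reproduction slots than a lone individual of equal raw fitness.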
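The ''random immigrants'' remedy for dynamic problems amounts to swapping a slice of the population for freshly generated individuals each generation. A minimal sketch, with invented names, assuming the caller supplies a factory for new random genomes:

```python
import random

def inject_random_immigrants(population, new_individual, fraction=0.2, rng=random):
    """Replace a fraction of the population with fresh random individuals.

    `new_individual` is a zero-argument factory for a random genome;
    `rng` can be a seeded random.Random for reproducibility.
    Names are illustrative, not a standard API.
    """
    n_immigrants = max(1, int(len(population) * fraction))
    # Keep a random subset of the old population...
    survivors = rng.sample(population, len(population) - n_immigrants)
    # ...and top it back up with brand-new individuals.
    return survivors + [new_individual() for _ in range(n_immigrants)]
```

Triggered hypermutation would instead leave the population intact and temporarily raise the mutation rate whenever the best fitness drops between generations.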
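The repeated-trial workaround for pass/fail problems can be stated directly: run the noisy trial many times and use the success ratio as a graded fitness. A short sketch under those assumptions (names invented for the example):

```python
def success_ratio_fitness(candidate, trial, n_trials=50):
    """Turn a noisy pass/fail trial into a graded fitness by repetition.

    `trial(candidate)` returns True or False and may give different
    results on different calls; the fraction of successes gives the GA
    a slope to climb instead of a flat pass/fail landscape.
    """
    successes = sum(1 for _ in range(n_trials) if trial(candidate))
    return successes / n_trials
```

If the trial is deterministic, repetition adds nothing and the landscape stays flat, which is exactly the case the bullet above says GAs handle poorly.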