{{Short description|Optimization technique}} {{Use American English|date=January 2019}} In [[computer science]] and [[mathematical optimization]], a '''metaheuristic''' is a higher-level [[procedure (computer science)|procedure]] or [[Heuristic (computer science)|heuristic]] designed to find, generate, tune, or select a heuristic (partial [[search algorithm]]) that may provide a sufficiently good solution to an [[optimization problem]] or a [[machine learning]] problem, especially with incomplete or imperfect information or limited computation capacity.<ref name=glover03handbook/><ref name="Bala2015" /><ref name="Bianchi2009" /><ref name="blum03metaheuristics" /> Metaheuristics sample a subset of a solution space that is otherwise too large to be completely enumerated or otherwise explored. Metaheuristics may make relatively few assumptions about the optimization problem being solved and so may be usable for a variety of problems.<ref name=glover03handbook/><ref name=":1" /><ref name=":2" /> Their use is of particular interest when exact or other (approximate) methods are not available or not expedient, either because the calculation time is too long or because, for example, the solution provided is too imprecise.
Compared to [[optimization algorithm]]s and [[iterative method]]s, metaheuristics do not guarantee that a [[global optimum|globally optimal solution]] can be found on some class of problems.<ref name=blum03metaheuristics /> Many metaheuristics implement some form of [[stochastic optimization]], so that the solution found is dependent on the set of [[random variable]]s generated.<ref name=Bianchi2009 /> In [[combinatorial optimization]], many problems belong to the class of [[NP-completeness|NP-complete]] problems and thus, beyond a relatively small problem size, can no longer be solved exactly within an acceptable time.<ref>{{Cite book |last1=Brucker |first1=Peter |url=https://link.springer.com/10.1007/978-3-642-23929-8 |title=Complex Scheduling |last2=Knust |first2=Sigrid |date=2012 |publisher=Springer |isbn=978-3-642-23928-1 |location=Berlin, Heidelberg |language=en |doi=10.1007/978-3-642-23929-8}}</ref><ref>{{Cite book |last1=Papadimitriou |first1=Christos H. |title=Combinatorial Optimization: Algorithms and Complexity |last2=Steiglitz |first2=Kenneth |date=1998 |publisher=Dover Publ., corrected, unabridged new edition of the work published by Prentice-Hall in 1982. |isbn=978-0-486-40258-1 |location=Mineola, N.Y}}</ref> Metaheuristics then often provide good solutions with less computational effort than approximation methods, iterative methods, or simple heuristics.<ref name=blum03metaheuristics /><ref name=glover03handbook/> This also applies in the field of continuous or mixed-integer optimization.<ref name=glover03handbook/><ref name=":5">{{Cite journal |last=Gad |first=Ahmed G. 
|date=2022 |title=Particle Swarm Optimization Algorithm and Its Applications: A Systematic Review |journal=Archives of Computational Methods in Engineering |language=en |volume=29 |issue=5 |pages=2531–2561 |doi=10.1007/s11831-021-09694-4 |issn=1134-3060|doi-access=free }}</ref><ref>{{Cite journal |last1=Li |first1=Zhenhua |last2=Lin |first2=Xi |last3=Zhang |first3=Qingfu |last4=Liu |first4=Hailin |date=2020 |title=Evolution strategies for continuous optimization: A survey of the state-of-the-art |url=https://linkinghub.elsevier.com/retrieve/pii/S221065021930584X |journal=Swarm and Evolutionary Computation |language=en |volume=56 |pages=100694 |doi=10.1016/j.swevo.2020.100694}}</ref> As such, metaheuristics are useful approaches for optimization problems.<ref name=Bianchi2009 /> Several books and survey papers have been published on the subject.<ref name=Bianchi2009 /><ref name=blum03metaheuristics/><ref name=glover03handbook/><ref name=goldberg89genetic/><ref name=talbi09metaheuristics/> A literature review on metaheuristic optimization<ref>X. S. Yang, Metaheuristic optimization, Scholarpedia, 6(8):11472 (2011).</ref> suggested that it was Fred Glover who coined the word ''metaheuristics''.<ref>{{cite journal|last1=Glover |first1=Fred |title=Future paths for integer programming and links to artificial intelligence |journal=Computers and Operations Research |date=January 1986 |volume=13 |issue=5 |pages=533–549 |doi=10.1016/0305-0548(86)90048-1 |issn=0305-0548 |url=https://leeds-faculty.colorado.edu/glover/fred%20pubs/174%20-%20Future%20Paths%20for%20Integer%20Programming%20TS.pdf}}</ref> Most literature on metaheuristics is experimental in nature, describing empirical results based on [[computer experiment]]s with the algorithms.
However, some formal theoretical results are also available, often on [[convergence (mathematics)|convergence]] and the possibility of finding the global optimum.<ref name=blum03metaheuristics /><ref>{{Cite journal |last=Rudolph |first=Günter |date=2001 |title=Self-adaptive mutations may lead to premature convergence |url=https://ieeexplore.ieee.org/document/942534 |journal=IEEE Transactions on Evolutionary Computation |volume=5 |issue=4 |pages=410–414 |doi=10.1109/4235.942534|hdl=2003/5378 |hdl-access=free }}</ref> Also worth mentioning are the [[No free lunch theorem|no-free-lunch theorems]], which imply that no metaheuristic can outperform all others on every problem. Especially since the turn of the millennium, many metaheuristic methods have been published with claims of novelty and practical efficacy. While the field also features high-quality research, many of the more recent publications have been of poor quality; flaws include vagueness, lack of conceptual elaboration, poor experiments, and ignorance of previous literature.<ref name="Sörensen2013">{{cite journal |last=Sörensen |first=Kenneth |year=2015 |title=Metaheuristics—the metaphor exposed |url=https://onlinelibrary.wiley.com/doi/10.1111/itor.12001 |journal=International Transactions in Operational Research |language=en |volume=22 |issue=1 |pages=3–18 |citeseerx=10.1.1.470.3422 |doi=10.1111/itor.12001 |s2cid=14042315}}</ref><ref name=":3">{{Cite web |last1=Brownlee |first1=Alexander |last2=Woodward |first2=John R. |title=Why we fell out of love with algorithms inspired by nature |url=https://theconversation.com/why-we-fell-out-of-love-with-algorithms-inspired-by-nature-42718 |access-date=2024-08-30 |website=The Conversation (website)|date=3 June 2015 }}</ref>
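The stochastic search behavior described above can be illustrated with a minimal simulated-annealing loop, one classic metaheuristic. This is an illustrative sketch, not the method of any particular reference; the test function, parameter values, and function names are chosen for the example. Worse candidates are accepted with probability exp(−Δ/t), which lets the search escape local optima while the "temperature" t is high, and the best solution seen so far is tracked separately:

```python
import math
import random

def simulated_annealing(f, x0, steps=10_000, t0=10.0, cooling=0.999,
                        step_size=0.5, rng=None):
    """Minimize f over the reals with a basic simulated-annealing loop."""
    rng = rng or random.Random()
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    t = t0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, step_size)   # sample a random neighbor
        fc = f(cand)
        delta = fc - fx
        # Always accept improvements; accept worse moves with prob. exp(-delta/t)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
        if fx < best_fx:
            best_x, best_fx = x, fx            # track best solution found so far
        t *= cooling                           # geometric cooling schedule
    return best_x, best_fx

# A multimodal objective with many local minima:
f = lambda x: x * x + 10.0 * math.sin(5.0 * x)
best_x, best_fx = simulated_annealing(f, x0=8.0, rng=random.Random(42))
```

Because the moves are random, different seeds generally yield different results, matching the statement above that the solution found depends on the random variables generated; the best-so-far value is, however, never worse than the starting point.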