Neuroevolution
==Examples==
Examples of neuroevolution methods (those with direct encodings are necessarily non-embryogenic):

{| class="wikitable" border="1"
|-
! Method
! Encoding
! Evolutionary algorithm
! Aspects evolved
|-
| Neuro-genetic evolution by E. Ronald, 1994<ref>{{citation |contribution=Genetic Lander: An experiment in accurate neuro-genetic control |first1=Edmund |last1=Ronald |first2=Marc |last2=Schoenauer |title=Parallel Problem Solving from Nature – PPSN III |pages=452–461 |citeseerx=10.1.1.56.3139 |year=1994 }}</ref>
| Direct
| [[Genetic algorithm]]
| Network weights
|-
| Cellular Encoding (CE) by F. Gruau, 1994<ref name=gruau94/>
| Indirect, embryogenic (grammar tree using [[S-expression]]s)
| [[Genetic programming]]
| Structure and parameters (simultaneous, complexification)
|-
| GNARL by Angeline et al., 1994<ref>{{cite journal |last1=Angeline |first1=P.J. |last2=Saunders |first2=G.M. |last3=Pollack |first3=J.B. |title=An evolutionary algorithm that constructs recurrent neural networks |journal=IEEE Transactions on Neural Networks |date=January 1994 |volume=5 |issue=1 |pages=54–65 |doi=10.1109/72.265960 |pmid=18267779 |citeseerx=10.1.1.64.1853 |s2cid=44767 }}</ref>
| Direct
| [[Evolutionary programming]]
| Structure and parameters (simultaneous, complexification)
|-
| EPNet by Yao and Liu, 1997<ref>{{cite journal |last1=Yao |first1=X. |last2=Liu |first2=Y. |title=A new evolutionary system for evolving artificial neural networks |journal=IEEE Transactions on Neural Networks |date=May 1997 |volume=8 |issue=3 |pages=694–713 |doi=10.1109/72.572107 |pmid=18255671 }}</ref>
| Direct
| [[Evolutionary programming]] (combined with [[backpropagation]] and [[simulated annealing]])
| Structure and parameters (mixed, complexification and simplification)
|-
| [[NeuroEvolution of Augmenting Topologies]] (NEAT) by Stanley and Miikkulainen, 2002<ref name=autogenerated1>{{cite web |title=Real-Time Neuroevolution in the NERO Video Game |first1=Kenneth O. |last1=Stanley |first2=Bobby D. |last2=Bryant |first3=Risto |last3=Miikkulainen |url=https://nn.cs.utexas.edu/downloads/papers/stanley.ieeetec05.pdf |date=December 2005 }}</ref><ref>{{cite journal |last1=Stanley |first1=Kenneth O. |last2=Miikkulainen |first2=Risto |title=Evolving Neural Networks through Augmenting Topologies |journal=Evolutionary Computation |date=June 2002 |volume=10 |issue=2 |pages=99–127 |doi=10.1162/106365602320169811 |pmid=12180173 |citeseerx=10.1.1.638.3910 |s2cid=498161 }}</ref>
| Direct
| [[Genetic algorithm]]. Tracks genes with historical markings to allow crossover between different topologies, and protects innovation via speciation.
| Structure and parameters
|-
| [[HyperNEAT|Hypercube-based NeuroEvolution of Augmenting Topologies]] (HyperNEAT) by Stanley, D'Ambrosio, and Gauci, 2008<ref name=hyperneat />
| Indirect, non-embryogenic (spatial patterns generated by a [[Compositional pattern-producing network]] (CPPN) within a [[hypercube]] are interpreted as connectivity patterns in a lower-dimensional space)
| [[Genetic algorithm]]. The NEAT algorithm (above) is used to evolve the CPPN.
| Parameters; structure fixed (functionally fully connected)
|-
| [[ES-HyperNEAT|Evolvable Substrate Hypercube-based NeuroEvolution of Augmenting Topologies]] (ES-HyperNEAT) by Risi and Stanley, 2012<ref name=eshyperalife/>
| Indirect, non-embryogenic (spatial patterns generated by a [[Compositional pattern-producing network]] (CPPN) within a [[hypercube]] are interpreted as connectivity patterns in a lower-dimensional space)
| [[Genetic algorithm]]. The NEAT algorithm (above) is used to evolve the CPPN.
| Parameters and network structure
|-
| [[Evolutionary Acquisition of Neural Topologies]] (EANT/EANT2) by Kassahun and Sommer, 2005<ref>{{citation |first1=Yohannes |last1=Kassahun |first2=Gerald |last2=Sommer |contribution=Efficient reinforcement learning through evolutionary acquisition of neural topologies |title=13th European Symposium on Artificial Neural Networks |pages=259–266 |location=Bruges, Belgium |date=April 2005 |contribution-url=https://ks.informatik.uni-kiel.de/~yk/ESANN2005EANT.pdf}}</ref> / Siebel and Sommer, 2007<ref>{{cite journal |last1=Siebel |first1=Nils T. |last2=Sommer |first2=Gerald |title=Evolutionary reinforcement learning of artificial neural networks |journal=International Journal of Hybrid Intelligent Systems |date=17 October 2007 |volume=4 |issue=3 |pages=171–183 |doi=10.3233/his-2007-4304 }}</ref>
| Direct and indirect, potentially embryogenic (Common Genetic Encoding<ref name=cgegecco />)
| [[Evolutionary programming]]/[[Evolution strategies]]
| Structure and parameters (separately, complexification)
|-
| [[Interactively Constrained Neuro-Evolution]] (ICONE) by Rempis, 2012<ref>{{cite thesis |last1=Rempis |first1=Christian Wilhelm |title=Evolving Complex Neuro-Controllers with Interactively Constrained Neuro-Evolution |date=2012 |url=https://osnadocs.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-2012101710370 }}</ref>
| Direct; includes constraint masks to restrict the search to specific topology/parameter manifolds.
| [[Evolutionary algorithm]]. Uses constraint masks to drastically reduce the search space by exploiting [[domain knowledge]].
| Structure and parameters (separately, complexification, interactive)
|-
| [[Deus Ex Neural Network]] (DXNN) by Gene Sher, 2012<ref>{{cite book |doi=10.1007/978-1-4614-4463-3 |title=Handbook of Neuroevolution Through Erlang |year=2013 |last1=Sher |first1=Gene I. |isbn=978-1-4614-4462-6 |s2cid=21777855 }}</ref>
| Direct/indirect; includes constraints and local tuning, and allows evolution to integrate new sensors and actuators.
| [[Memetic algorithm]]. Evolves network structure and parameters on different time-scales.
| Structure and parameters (separately, complexification, interactive)
|-
| [[Spectrum-diverse Unified Neuroevolution Architecture]] (SUNA) by Danilo Vasconcellos Vargas and Junichi Murata<ref>{{cite journal |first1=Danilo Vasconcellos |last1=Vargas |first2=Junichi |last2=Murata |journal=IEEE Transactions on Neural Networks and Learning Systems |volume=28 |issue=8 |pages=1759–1773 |title=Spectrum-Diverse Neuroevolution With Unified Neural Models |doi=10.1109/TNNLS.2016.2551748 |pmid=28113564 |year=2019 |arxiv=1902.06703 |bibcode=2019arXiv190206703V |s2cid=206757620 }}</ref> ([https://github.com/zweifel/Physis-Shard Download code])
| Direct; introduces the [[Unified Neural Representation]] (a representation integrating most of the neural network features from the literature).
| [[Genetic algorithm]] with a diversity-preserving mechanism called [[Spectrum-diversity]] that scales well with chromosome size, is problem-independent, and focuses on obtaining diversity of high-level behaviours/approaches. To achieve this diversity, the concept of [[chromosome Spectrum]] is introduced and used together with a [[Novelty Map Population]].
| Structure and parameters (mixed, complexification and simplification)
|-
| [[Modular Agent-Based Evolver]] (MABE) by Clifford Bohm, Arend Hintze, and others<ref>{{cite journal |first1=Jeffrey |last1=Edlund |first2=Nicolas |last2=Chaumont |first3=Arend |last3=Hintze |first4=Christof |last4=Koch |first5=Giulio |last5=Tononi |first6=Christoph |last6=Adami |journal=PLOS Computational Biology |volume=7 |issue=10 |pages=e1002236 |title=Integrated Information Increases with Fitness in the Evolution of Animats |doi=10.1371/journal.pcbi.1002236 |pmid=22028639 |pmc=3197648 |year=2011 |arxiv=1103.1791 |bibcode=2011PLSCB...7E2236E |doi-access=free }}</ref> ([https://github.com/Hintzelab/MABE Download code])
| Direct or indirect encoding of [[Markov network]]s, neural networks, genetic programming, and other arbitrarily customizable controllers.
| Provides evolutionary and genetic programming algorithms, allows customized algorithms, and supports specification of arbitrary constraints.
| Evolvable aspects include the neural model, and the framework also allows for the evolution of morphology and sexual selection, among others.
|-
| Covariance Matrix Adaptation with Hypervolume Sorted Adaptive Grid Algorithm (CMA-HAGA) by Shahin Rostami and others<ref>{{cite journal |last1=Rostami |first1=Shahin |last2=Neri |first2=Ferrante |title=A fast hypervolume driven selection mechanism for many-objective optimisation problems |journal=Swarm and Evolutionary Computation |date=June 2017 |volume=34 |pages=50–67 |doi=10.1016/j.swevo.2016.12.002 |hdl=2086/13102 |hdl-access=free }}</ref><ref>{{cite book |doi=10.1109/CIBCB.2017.8058553 |chapter=Multi-objective evolution of artificial neural networks in multi-class medical diagnosis problems with class imbalance |title=2017 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB) |year=2017 |last1=Shenfield |first1=Alex |last2=Rostami |first2=Shahin |pages=1–8 |isbn=978-1-4673-8988-4 |s2cid=22674515 |chapter-url=https://eprints.bournemouth.ac.uk/29999/1/CIBCB2017_fetal.pdf }}</ref>
| Direct; includes an [[atavism]] feature which enables traits to disappear and reappear at different generations.
| Multi-Objective [[Evolution strategy|Evolution Strategy]] with [[Preference Articulation]] ([[Computational Steering]])
| Structure, weights, and biases
|-
| GACNN (evolutionary pressure-driven) by Di Biasi et al., 2023<ref>{{Cite book |last1=Di Biasi |first1=Luigi |last2=De Marco |first2=Fabiola |last3=Auriemma Citarella |first3=Alessia |last4=Barra |first4=Paola |last5=Piotto Piotto |first5=Stefano |last6=Tortora |first6=Genoveffa |chapter=Hybrid Approach for the Design of CNNS Using Genetic Algorithms for Melanoma Classification |series=Lecture Notes in Computer Science |date=2023 |volume=13643 |editor-last=Rousseau |editor-first=Jean-Jacques |editor2-last=Kapralos |editor2-first=Bill |title=Pattern Recognition, Computer Vision, and Image Processing. ICPR 2022 International Workshops and Challenges |chapter-url=https://link.springer.com/chapter/10.1007/978-3-031-37660-3_36 |language=en |location=Cham |publisher=Springer Nature Switzerland |pages=514–528 |doi=10.1007/978-3-031-37660-3_36 |isbn=978-3-031-37660-3}}</ref>
| Direct
| [[Genetic algorithm]], Single-Objective Evolution Strategy, specialized for convolutional neural networks
| Structure
|-
| Fast-DENSER by Assunção et al., 2021<ref>{{Cite journal |last1=Assunção |first1=Filipe |last2=Lourenço |first2=Nuno |last3=Ribeiro |first3=Bernardete |last4=Machado |first4=Penousal |date=June 2021 |title=Fast-DENSER: Fast Deep Evolutionary Network Structured Representation |url=https://linkinghub.elsevier.com/retrieve/pii/S235271102100039X |journal=SoftwareX |language=en |volume=14 |pages=100694 |doi=10.1016/j.softx.2021.100694 |bibcode=2021SoftX..1400694A |hdl=10316/100856 |hdl-access=free }}</ref> and others<ref>{{Citation |last1=Vinhas |first1=Adriano |title=Towards evolution of Deep Neural Networks through contrastive Self-Supervised learning |date=2024-06-20 |arxiv=2406.14525 |last2=Correia |first2=João |last3=Machado |first3=Penousal}}</ref><ref>{{Citation |last1=Cortês |first1=Gabriel |title=Towards Physical Plausibility in Neuroevolution Systems |date=2024 |work=Applications of Evolutionary Computation |volume=14635 |pages=76–90 |editor-last=Smith |editor-first=Stephen |url=https://link.springer.com/10.1007/978-3-031-56855-8_5 |access-date=2024-06-21 |place=Cham |publisher=Springer Nature Switzerland |language=en |doi=10.1007/978-3-031-56855-8_5 |isbn=978-3-031-56854-1 |last2=Lourenço |first2=Nuno |last3=Machado |first3=Penousal |editor2-last=Correia |editor2-first=João |editor3-last=Cintrano |editor3-first=Christian |url-access=subscription }}</ref>
| Indirect
| [[Grammatical evolution]] (Dynamic Structured Grammar Evolution)<ref>{{Citation |last1=Lourenço |first1=Nuno |title=Structured Grammatical Evolution: A Dynamic Approach |date=2018 |work=Handbook of Grammatical Evolution |pages=137–161 |editor-last=Ryan |editor-first=Conor |url=https://doi.org/10.1007/978-3-319-78717-6_6 |access-date=2024-06-21 |place=Cham |publisher=Springer International Publishing |language=en |doi=10.1007/978-3-319-78717-6_6 |isbn=978-3-319-78717-6 |last2=Assunção |first2=Filipe |last3=Pereira |first3=Francisco B. |last4=Costa |first4=Ernesto |last5=Machado |first5=Penousal |editor2-last=O'Neill |editor2-first=Michael |editor3-last=Collins |editor3-first=JJ |url-access=subscription }}</ref>
| Structure and the optimiser used for training
|}
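At their simplest, the direct-encoding methods in the table above (such as the first row) amount to a genetic algorithm mutating a flat vector of connection weights for a fixed network topology. The following sketch is illustrative only, not taken from any cited method; the network shape, selection scheme, and all hyperparameters are chosen arbitrarily for the example. It evolves the nine weights and biases of a fixed 2-2-1 network to solve XOR:

```python
import math
import random

random.seed(0)  # for reproducibility of this illustrative run

# Direct encoding: the genome is a flat list of 9 numbers for a fixed
# 2-2-1 feedforward network: hidden weights (4), hidden biases (2),
# output weights (2), output bias (1).
def forward(genome, x):
    h = [math.tanh(genome[2 * i] * x[0] + genome[2 * i + 1] * x[1] + genome[4 + i])
         for i in range(2)]
    return math.tanh(genome[6] * h[0] + genome[7] * h[1] + genome[8])

# Fitness on the XOR task: negative summed squared error (higher is better).
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def fitness(genome):
    return -sum((forward(genome, x) - y) ** 2 for x, y in XOR)

def evolve(pop_size=50, generations=200, sigma=0.5):
    # Initial population of random genomes.
    pop = [[random.gauss(0, 1) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 5]  # truncation selection (keep top 20%)
        # Refill the population with Gaussian-mutated copies of parents.
        pop = parents + [[g + random.gauss(0, sigma) for g in random.choice(parents)]
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=fitness)

best = evolve()
```

Methods further down the table differ mainly in what else enters the genome: NEAT-style approaches also mutate the topology (adding nodes and connections), while indirect encodings such as CE or HyperNEAT evolve a generative description from which the network is built.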