=== In the wake of XCS ===
XCS inspired the development of a whole new generation of LCS algorithms and applications. In 1995, Congdon was the first to apply LCS to real-world [[Epidemiology|epidemiological]] investigations of disease,<ref name=":8" /> followed closely by Holmes, who developed '''BOOLE++''',<ref>{{Cite journal|last=Holmes|first=John H.|date=1996-01-01|title=A Genetics-Based Machine Learning Approach to Knowledge Discovery in Clinical Data|journal=Proceedings of the AMIA Annual Fall Symposium|pages=883|issn=1091-8280|pmc=2233061}}</ref> '''EpiCS''',<ref>Holmes, John H. "[https://web.archive.org/web/20180820234915/https://pdfs.semanticscholar.org/71e4/eb6c630dee4b762e74b2970f6dc638a351ab.pdf Discovering Risk of Disease with a Learning Classifier System]." In ''ICGA'', pp. 426-433. 1997.</ref> and later '''EpiXCS'''<ref>Holmes, John H., and Jennifer A. Sager. "[https://link.springer.com/10.1007%2F11527770_60 Rule discovery in epidemiologic surveillance data using EpiXCS: an evolutionary computation approach]." In ''Conference on Artificial Intelligence in Medicine in Europe'', pp. 444-452. Springer Berlin Heidelberg, 2005.</ref> for [[Epidemiology|epidemiological]] classification. These early works inspired later interest in applying LCS algorithms to complex and large-scale [[data mining]] tasks epitomized by [[bioinformatics]] applications.

In 1998, Stolzmann introduced '''anticipatory classifier systems (ACS)''', which included rules in the form of 'condition-action-effect', rather than the classic 'condition-action' representation.<ref name=":7" /> ACS was designed to predict the perceptual consequences of an action in all possible situations in an environment. In other words, the system evolves a model that specifies not only what to do in a given situation, but also what will happen after a specific action is executed. This family of LCS algorithms is best suited to multi-step problems, planning, speeding up learning, or disambiguating perceptual aliasing (i.e. where the same observation is obtained in distinct states but requires different actions). Butz later pursued this anticipatory family of LCS, developing a number of improvements to the original method.<ref>Butz, Martin V. "[https://web.archive.org/web/20180820234943/https://pdfs.semanticscholar.org/3572/7a56fcce7a73ccc43e5bfa19389780e6d436.pdf Biasing exploration in an anticipatory learning classifier system]." In ''International Workshop on Learning Classifier Systems'', pp. 3-22. Springer Berlin Heidelberg, 2001.</ref>

In 2002, Wilson introduced '''XCSF''', adding a computed action in order to perform function approximation.<ref>{{Cite journal|last=Wilson|first=Stewart W.|title=Classifiers that approximate functions|journal=Natural Computing|language=en|volume=1|issue=2–3|pages=211–234|doi=10.1023/A:1016535925043|issn=1567-7818|year=2002|s2cid=23032802}}</ref> In 2003, Bernado-Mansilla introduced the '''sUpervised Classifier System (UCS)''', which specialized the XCS algorithm to the task of [[supervised learning]], single-step problems, and forming a best action set. UCS removed the [[reinforcement learning]] strategy in favor of a simple, accuracy-based rule fitness, and dropped the explore/exploit learning phases characteristic of many reinforcement learners. Bull introduced a simple accuracy-based LCS ('''YCS''')<ref>Bull, Larry. "[https://web.archive.org/web/20180820234941/https://pdfs.semanticscholar.org/120c/8f5057995c36ee60ec320c2263b20af05444.pdf A simple accuracy-based learning classifier system]." ''Learning Classifier Systems Group Technical Report UWELCSG03-005, University of the West of England, Bristol, UK'' (2003).</ref> and a simple strength-based LCS, the '''Minimal Classifier System (MCS)''',<ref>Bull, Larry. "[https://link.springer.com/chapter/10.1007/978-3-540-30217-9_104 A simple payoff-based learning classifier system]." In ''International Conference on Parallel Problem Solving from Nature'', pp. 1032-1041. Springer Berlin Heidelberg, 2004.</ref> in order to develop a better theoretical understanding of the LCS framework.

Bacardit introduced '''GAssist'''<ref>Peñarroya, Jaume Bacardit. "Pittsburgh genetic-based machine learning in the data mining era: representations, generalization, and run-time." PhD diss., Universitat Ramon Llull, 2004.</ref> and '''BioHEL''',<ref>{{Cite journal|last1=Bacardit|first1=Jaume|last2=Burke|first2=Edmund K.|last3=Krasnogor|first3=Natalio|date=2008-12-12|title=Improving the scalability of rule-based evolutionary learning|journal=Memetic Computing|language=en|volume=1|issue=1|pages=55–67|doi=10.1007/s12293-008-0005-4|s2cid=775199|issn=1865-9284}}</ref> Pittsburgh-style LCSs designed for [[data mining]] and [[scalability]] to large datasets in [[bioinformatics]] applications. In 2008, Drugowitsch published the book ''Design and Analysis of Learning Classifier Systems'', which includes a theoretical examination of LCS algorithms.<ref>{{Cite book|title=Design and Analysis of Learning Classifier Systems - Springer|volume = 139|doi=10.1007/978-3-540-79866-8|series = Studies in Computational Intelligence|year = 2008|isbn = 978-3-540-79865-1|last1 = Drugowitsch|first1 = Jan}}</ref> Butz introduced the first online visualization of rule learning within a [[Graphical user interface|GUI]] for XCSF<ref name=":9" /> (see the image at the top of this page).

Urbanowicz extended the UCS framework and introduced '''ExSTraCS''', explicitly designed for [[supervised learning]] in noisy problem domains (e.g. epidemiology and bioinformatics).<ref>Urbanowicz, Ryan J., Gediminas Bertasius, and Jason H. Moore. "[http://www.seas.upenn.edu/~gberta/uploads/3/1/4/8/31486883/urbanowicz_2014_exstracs_algorithm.pdf An extended michigan-style learning classifier system for flexible supervised learning, classification, and data mining]." In ''International Conference on Parallel Problem Solving from Nature'', pp. 211-221. Springer International Publishing, 2014.</ref> ExSTraCS integrated (1) expert knowledge to drive covering and the genetic algorithm towards important features in the data,<ref>Urbanowicz, Ryan J., Delaney Granizo-Mackenzie, and Jason H. Moore. "[https://web.archive.org/web/20180820234834/https://pdfs.semanticscholar.org/b407/8f8bb6aa9e39e84b0b20874662a6ed8b7df1.pdf Using expert knowledge to guide covering and mutation in a michigan style learning classifier system to detect epistasis and heterogeneity]." In ''International Conference on Parallel Problem Solving from Nature'', pp. 266-275. Springer Berlin Heidelberg, 2012.</ref> (2) a form of long-term memory referred to as attribute tracking,<ref>{{Cite book|last1=Urbanowicz|first1=Ryan|last2=Granizo-Mackenzie|first2=Ambrose|last3=Moore|first3=Jason|title=Proceedings of the 14th annual conference on Genetic and evolutionary computation |chapter=Instance-linked attribute tracking and feedback for michigan-style supervised learning classifier systems |date=2012-01-01|series=GECCO '12|location=New York, NY, USA|publisher=ACM|pages=927–934|doi=10.1145/2330163.2330291|isbn=9781450311779|s2cid=142534}}</ref> allowing for more efficient learning and the characterization of heterogeneous data patterns, and (3) a flexible rule representation similar to Bacardit's mixed discrete-continuous attribute list representation.<ref>{{Cite book|last1=Bacardit|first1=Jaume|last2=Krasnogor|first2=Natalio|title=Proceedings of the 11th Annual conference on Genetic and evolutionary computation |chapter=A mixed discrete-continuous attribute list representation for large scale classification domains |date=2009-01-01|series=GECCO '09|location=New York, NY, USA|publisher=ACM|pages=1155–1162|doi=10.1145/1569901.1570057|isbn=9781605583259|citeseerx=10.1.1.158.7314|s2cid=10906515}}</ref> Both Bacardit and Urbanowicz explored statistical and visualization strategies to interpret LCS rules and perform knowledge discovery for data mining.<ref name=":11">{{Cite journal|last1=Urbanowicz|first1=R. J.|last2=Granizo-Mackenzie|first2=A.|last3=Moore|first3=J. H.|date=2012-11-01|title=An analysis pipeline with statistical and visualization-guided knowledge discovery for Michigan-style learning classifier systems|journal=IEEE Computational Intelligence Magazine|volume=7|issue=4|pages=35–45|doi=10.1109/MCI.2012.2215124|issn=1556-603X|pmc=4244006|pmid=25431544}}</ref><ref name=":12">{{cite journal | last1 = Bacardit | first1 = Jaume | last2 = Llorà | first2 = Xavier | year = 2013 | title = Large-scale data mining using genetics-based machine learning | journal = Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery | volume = 3 | issue = 1| pages = 37–61 | doi=10.1002/widm.1078| s2cid = 43062613 }}</ref>

Browne and Iqbal explored the concept of reusing building blocks in the form of code fragments and were the first to solve the 135-bit multiplexer benchmark problem by first learning useful building blocks from simpler multiplexer problems.<ref>{{Cite journal|last1=Iqbal|first1=Muhammad|last2=Browne|first2=Will N.|last3=Zhang|first3=Mengjie|date=2014-08-01|title=Reusing Building Blocks of Extracted Knowledge to Solve Complex, Large-Scale Boolean Problems|journal=IEEE Transactions on Evolutionary Computation|pages=465–480|volume=18|issue=4|doi=10.1109/tevc.2013.2281537|s2cid=525358}}</ref> '''ExSTraCS 2.0''' was later introduced to improve Michigan-style LCS scalability, successfully solving the 135-bit multiplexer benchmark problem directly for the first time.<ref name=":0"/> The n-bit [[multiplexer]] problem is highly [[Epistasis|epistatic]] and [[Homogeneity and heterogeneity|heterogeneous]], making it a very challenging [[machine learning]] task.
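In the n-bit multiplexer, the first ''k'' bits encode an address that selects one of the remaining 2<sup>''k''</sup> register bits, and the correct output is the value of the selected bit; the 135-bit version uses ''k'' = 7 address bits and 128 register bits. The following minimal sketch, written here in Python purely for illustration (the function name <code>multiplexer</code> is hypothetical and not taken from any LCS implementation), shows the benchmark function:

<syntaxhighlight lang="python">
def multiplexer(bits):
    """Output of the n-bit multiplexer for a binary string such as "100010".

    Illustrative sketch: the first k bits form an address that selects one of
    the remaining 2**k register bits; the output is the value of that bit.
    """
    n = len(bits)
    # Recover k from n = k + 2**k (k = 7 for the 135-bit problem).
    k = 0
    while k + 2 ** k < n:
        k += 1
    if k + 2 ** k != n:
        raise ValueError("length must equal k + 2**k for some integer k")
    address = int(bits[:k], 2)       # interpret the address bits as an integer
    return int(bits[k + address])    # value of the selected register bit


# 6-bit example (k = 2): address '10' selects register bit 2, which is '1'.
assert multiplexer("100010") == 1
# Flipping an unaddressed register bit leaves the output unchanged,
# while flipping an address bit can change which register bit matters.
assert multiplexer("100011") == 1
assert multiplexer("110010") == 0
</syntaxhighlight>

Because the informative register bit changes with the address, no single feature is predictive on its own; rules must capture interactions between address and register bits, which is what makes the problem epistatic and heterogeneous.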