==The second wave==
The second wave began in the early 1980s. Key publications included ([[John Hopfield]], 1982),<ref>{{Cite journal |last=Hopfield |first=J J |date=April 1982 |title=Neural networks and physical systems with emergent collective computational abilities. |journal=Proceedings of the National Academy of Sciences |language=en |volume=79 |issue=8 |pages=2554–2558 |doi=10.1073/pnas.79.8.2554 |doi-access=free |issn=0027-8424 |pmc=346238 |pmid=6953413|bibcode=1982PNAS...79.2554H }}</ref> which popularized [[Hopfield network|Hopfield networks]]; the 1986 paper that popularized backpropagation;<ref>{{Cite journal |last1=Rumelhart |first1=David E. |last2=Hinton |first2=Geoffrey E. |last3=Williams |first3=Ronald J. |date=October 1986 |title=Learning representations by back-propagating errors |url=https://www.nature.com/articles/323533a0 |journal=Nature |language=en |volume=323 |issue=6088 |pages=533–536 |doi=10.1038/323533a0 |bibcode=1986Natur.323..533R |issn=1476-4687|url-access=subscription }}</ref> and the 1987 two-volume book ''Parallel Distributed Processing'' (PDP) by [[James L. McClelland]], [[David E. Rumelhart]] et al., which introduced several improvements to the simple perceptron idea, such as intermediate processors (now known as "[[hidden layers]]") between the input and output units, and the use of a [[Sigmoid function|sigmoid]] [[activation function]] instead of the old 'all-or-nothing' function. Hopfield approached the field from the perspective of statistical mechanics, providing an early form of mathematical rigor that increased the perceived respectability of the field.<ref name="2019TheCuriousCaseOfConnectionism" /> Another important series of publications proved that neural networks are [[Universal approximation theorem|universal function approximators]], which also lent the field mathematical respectability.<ref>{{Cite journal |last=Cybenko |first=G. |date=1989-12-01 |title=Approximation by superpositions of a sigmoidal function |url=https://doi.org/10.1007/BF02551274 |journal=Mathematics of Control, Signals and Systems |language=en |volume=2 |issue=4 |pages=303–314 |doi=10.1007/BF02551274 |bibcode=1989MCSS....2..303C |issn=1435-568X}}</ref> Some early popular demonstration projects appeared during this time. [[NETtalk (artificial neural network)|NETtalk]] (1987) learned to pronounce written English. It achieved popular success, appearing on the [[Today (American TV program)|''Today'' show]].<ref name=":02">{{Cite book |last=Sejnowski |first=Terrence J. |title=The deep learning revolution |date=2018 |publisher=The MIT Press |isbn=978-0-262-03803-4 |location=Cambridge, Massachusetts London, England}}</ref> [[TD-Gammon]] (1992) reached top human level in [[backgammon]].<ref>{{Citation |title=TD-Gammon |date=2010 |pages=955–956 |editor-last=Sammut |editor-first=Claude |url=https://doi.org/10.1007/978-0-387-30164-8_813 |access-date=2023-12-25 |place=Boston, MA |publisher=Springer US |language=en |doi=10.1007/978-0-387-30164-8_813 |isbn=978-0-387-30164-8 |editor2-last=Webb |editor2-first=Geoffrey I. |encyclopedia=Encyclopedia of Machine Learning|url-access=subscription }}</ref>
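The core mechanism of a Hopfield network can be sketched in a few lines: weights are set by a Hebbian rule (each stored pattern reinforces correlations between units), and units are then updated by a simple threshold rule until the state settles into a stored pattern. The following plain-Python sketch is illustrative only (the function and variable names are this example's own, not from the cited sources), showing how a one-bit-corrupted cue is pulled back to the stored ±1 pattern:

```python
def train(patterns):
    # Hebbian storage: W[i][j] accumulates p[i]*p[j] over patterns, zero diagonal
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def recall(W, state, sweeps=5):
    # Asynchronous threshold updates: each unit flips to the sign of its input
    s = list(state)
    n = len(s)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, -1, 1, -1, 1, -1]
W = train([pattern])
noisy = [1, 1, 1, -1, 1, -1]   # one unit flipped relative to the stored pattern
print(recall(W, noisy))         # settles back to the stored pattern
```

Each update can only lower (or keep) an energy function of the state, which is why the dynamics settle into stable "attractor" patterns; this energy formulation is what connected the model to statistical mechanics.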