== History ==
{{Primary sources|date=October 2024}}
One origin of associative memory is human [[cognitive psychology]], specifically the study of [[Associative memory (psychology)|associative memory]]. [[Frank Rosenblatt]] studied "close-loop cross-coupled perceptrons", which are 3-layered [[perceptron]] networks whose middle layer contains recurrent connections that change according to a [[Hebbian theory|Hebbian learning]] rule.<ref>F. Rosenblatt, "[[iarchive:SelfOrganizingSystems/page/n87/mode/1up|Perceptual Generalization over Transformation Groups]]", pp. 63–100 in ''Self-organizing Systems: Proceedings of an Inter-disciplinary Conference, 5 and 6 May 1959''. Edited by Marshall C. Yovitz and Scott Cameron. London, New York, [etc.], Pergamon Press, 1960. ix, 322 p.</ref>{{Pg|pages=73–75}}<ref name=":12">{{Cite book |last=Rosenblatt |first=Frank |url=https://archive.org/details/DTIC_AD0256582/page/n3/mode/2up |title=Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms |date=1961-03-15 |publisher=Defense Technical Information Center |language=en}}</ref>{{Pg|location=Chapter 19, 21}}

In another model of associative memory, the output does not loop back to the input. W. K. Taylor proposed such a model, trained by Hebbian learning, in 1956.<ref>W. K. Taylor, 1956. ''Electrical simulation of some nervous system functional activities''. Information Theory 3, E. C. Cherry (ed.), pp. 314–328. London: Butterworths.</ref> [[Karl Steinbuch]], who sought to understand learning and was inspired by watching his children learn,<ref>''[https://www.itiv.kit.edu/downloads/euology_1_.pdf Eulogy: 1917 Karl Steinbuch 2005]'', by Bernard Widrow, Reiner Hartenstein, Robert Hecht-Nielsen, IEEE Computational Intelligence Society. page 5. August 2005.</ref> published the [[Lernmatrix]] in 1961.<ref>{{Cite journal |last=Steinbuch |first=K. |date=1961-01-01 |title=Die Lernmatrix |url=https://link.springer.com/article/10.1007/BF00293853 |journal=Kybernetik |language=de |volume=1 |issue=1 |pages=36–45 |doi=10.1007/BF00293853 |issn=1432-0770 |url-access=subscription}}</ref><ref>{{Cite book |last=Steinbuch |first=Karl |url=https://openlibrary.org/books/OL27019478M/Automat_und_Mensch |title=Automat und Mensch: über menschliche und maschinelle Intelligenz |date=1961 |publisher=Springer |isbn=978-3-642-53168-2 |location=Berlin |ol=27019478M}}</ref> It was translated into English in 1963.<ref>{{Cite journal |last1=Steinbuch |first1=K. |last2=Piske |first2=U. A. W. |date=December 1963 |title=Learning matrices and their applications |url=https://ieeexplore.ieee.org/document/4038032 |journal=IEEE Transactions on Electronic Computers |volume=EC-12 |issue=6 |pages=846–862 |doi=10.1109/PGEC.1963.263588 |issn=0367-7508 |url-access=subscription}}</ref> Similar research was done on the ''correlogram'' by D. J. Willshaw et al. in 1969.<ref>{{Cite journal |last1=Willshaw |first1=D. J. |last2=Buneman |first2=O. P. |last3=Longuet-Higgins |first3=H. C. |date=June 1969 |title=Non-Holographic Associative Memory |url=https://www.nature.com/articles/222960a0 |journal=Nature |volume=222 |issue=5197 |pages=960–962 |doi=10.1038/222960a0 |pmid=5789326 |bibcode=1969Natur.222..960W |issn=0028-0836 |url-access=subscription}}</ref> [[Teuvo Kohonen]] trained an associative memory by gradient descent in 1974.<ref>{{Cite journal |last=Kohonen |first=T. |date=April 1974 |title=An Adaptive Associative Memory Principle |url=https://ieeexplore.ieee.org/document/1672553 |journal=IEEE Transactions on Computers |volume=C-23 |issue=4 |pages=444–445 |doi=10.1109/T-C.1974.223960 |issn=0018-9340 |url-access=subscription}}</ref>

[[File:Typical_connections_in_a_close-loop_cross-coupled_perceptron.png|thumb|A close-loop cross-coupled perceptron network, from ''Principles of Neurodynamics'' (1961){{Pg|page=403|location=Fig. 47}}.]]

Another origin of associative memory was [[statistical mechanics]]. The Ising model was published in the 1920s as a model of magnetism; however, it described only thermal equilibrium, which does not change with time. In 1963, [[Roy J. Glauber]] studied the Ising model evolving in time, as a process towards thermal equilibrium ([[Glauber dynamics]]), thereby adding the component of time.<ref name=":22">{{cite journal |last1=Glauber |first1=Roy J. |date=February 1963 |title=Time-Dependent Statistics of the Ising Model |url=https://aip.scitation.org/doi/abs/10.1063/1.1703954 |journal=Journal of Mathematical Physics |volume=4 |issue=2 |pages=294–307 |doi=10.1063/1.1703954 |access-date=2021-03-21 |url-access=subscription}}</ref>

The second component to be added was adaptation to stimulus. Kaoru Nakano in 1971<ref name="Nakano1971">{{cite book |last1=Nakano |first1=Kaoru |title=Pattern Recognition and Machine Learning |date=1971 |isbn=978-1-4615-7568-9 |pages=172–186 |chapter=Learning Process in a Model of Associative Memory |doi=10.1007/978-1-4615-7566-5_15}}</ref><ref name="Nakano1972">{{cite journal |last1=Nakano |first1=Kaoru |date=1972 |title=Associatron-A Model of Associative Memory |journal=IEEE Transactions on Systems, Man, and Cybernetics |volume=SMC-2 |issue=3 |pages=380–388 |doi=10.1109/TSMC.1972.4309133}}</ref> and [[Shun'ichi Amari]] in 1972<ref name="Amari1972">{{cite journal |last1=Amari |first1=Shun-Ichi |date=1972 |title=Learning patterns and pattern sequences by self-organizing nets of threshold elements |journal=IEEE Transactions on Computers |volume=C-21 |pages=1197–1206}}</ref> independently proposed modifying the weights of an Ising model by the [[Hebbian theory|Hebbian learning]] rule as a model of associative memory. The same idea was published by {{ill|William A. Little (physicist)|lt=William A. Little|de|William A. Little}} in 1974,<ref name="little74">{{cite journal |last=Little |first=W. A. |year=1974 |title=The Existence of Persistent States in the Brain |journal=Mathematical Biosciences |volume=19 |issue=1–2 |pages=101–120 |doi=10.1016/0025-5564(74)90031-5}}</ref> whom Hopfield acknowledged in his 1982 paper. See Carpenter (1989)<ref>{{Cite journal |last=Carpenter |first=Gail A |date=1989-01-01 |title=Neural network models for pattern recognition and associative memory |url=https://www.sciencedirect.com/science/article/abs/pii/089360808990035X |journal=Neural Networks |volume=2 |issue=4 |pages=243–257 |doi=10.1016/0893-6080(89)90035-X |issn=0893-6080 |url-access=subscription}}</ref> and Cowan (1990)<ref>{{Cite journal |last=Cowan |first=Jack D. |date=January 1990 |title=Discussion: McCulloch-Pitts and related neural nets from 1943 to 1989 |url=http://link.springer.com/10.1007/BF02459569 |journal=Bulletin of Mathematical Biology |language=en |volume=52 |issue=1–2 |pages=73–97 |doi=10.1007/BF02459569 |issn=0092-8240 |url-access=subscription}}</ref> for a technical description of some of these early works in associative memory.
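In modern notation (a schematic summary; the original papers differ in their conventions), the Hebbian prescription used in these models stores <math>P</math> binary patterns <math>\xi^{\mu} \in \{-1, +1\}^{N}</math> in symmetric weights
<math display="block">w_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu}, \qquad w_{ii} = 0,</math>
so that, when the number of patterns is small relative to <math>N</math>, each stored pattern is a fixed point of the threshold update <math>s_i \leftarrow \operatorname{sgn}\bigl(\textstyle\sum_{j} w_{ij} s_j\bigr)</math>.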
The [[Spin glass#Sherrington–Kirkpatrick model|Sherrington–Kirkpatrick model]] of spin glass, published in 1975,<ref>{{Cite journal |last1=Sherrington |first1=David |last2=Kirkpatrick |first2=Scott |date=1975-12-29 |title=Solvable Model of a Spin-Glass |url=https://link.aps.org/doi/10.1103/PhysRevLett.35.1792 |journal=Physical Review Letters |volume=35 |issue=26 |pages=1792–1796 |doi=10.1103/PhysRevLett.35.1792 |bibcode=1975PhRvL..35.1792S |issn=0031-9007 |url-access=subscription}}</ref> is a Hopfield network whose symmetric connection weights are drawn at random rather than learned. Sherrington and Kirkpatrick found that the energy function of the SK model (given explicitly below) is highly likely to have many local minima. In his 1982 paper, Hopfield applied this recently developed theory to study the Hopfield network with binary activation functions.<ref name="Hopfield1982">{{cite journal |last1=Hopfield |first1=J. J. |date=1982 |title=Neural networks and physical systems with emergent collective computational abilities |journal=Proceedings of the National Academy of Sciences |volume=79 |issue=8 |pages=2554–2558 |bibcode=1982PNAS...79.2554H |doi=10.1073/pnas.79.8.2554 |pmc=346238 |pmid=6953413 |doi-access=free}}</ref> In a 1984 paper he extended this to continuous activation functions.<ref name=":0" /> The Hopfield network became a standard model for the study of neural networks through statistical mechanics.<ref>{{Cite book |last1=Engel |first1=A. |title=Statistical mechanics of learning |last2=Broeck |first2=C. van den |date=2001 |publisher=Cambridge University Press |isbn=978-0-521-77307-2 |location=Cambridge, UK; New York, NY}}</ref><ref>{{Cite journal |last1=Seung |first1=H. S. |last2=Sompolinsky |first2=H. |last3=Tishby |first3=N. |date=1992-04-01 |title=Statistical mechanics of learning from examples |url=https://journals.aps.org/pra/abstract/10.1103/PhysRevA.45.6056 |journal=Physical Review A |volume=45 |issue=8 |pages=6056–6091 |doi=10.1103/PhysRevA.45.6056 |pmid=9907706 |bibcode=1992PhRvA..45.6056S |url-access=subscription}}</ref>

A major advance in memory storage capacity was developed by Dmitry Krotov and Hopfield in 2016<ref name=":1" /> through a change in network dynamics and energy function. This idea was further extended by Demircigil and collaborators in 2017.<ref name=":2" /> The continuous dynamics of large memory capacity models were developed in a series of papers between 2016 and 2020.<ref name=":1" /><ref name=":3" /><ref name=":4" /> Hopfield networks with large memory storage capacity are now called Dense Associative Memories or [[modern Hopfield network]]s.

In 2024, John J. Hopfield and [[Geoffrey E. Hinton]] were awarded the [[Nobel Prize in Physics]] for their foundational contributions to machine learning, such as the Hopfield network.
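In the same schematic notation as above, both the SK model and the binary Hopfield network assign to each configuration of units <math>s_i \in \{-1, +1\}</math> the quadratic energy
<math display="block">E = -\frac{1}{2} \sum_{i \neq j} w_{ij} s_i s_j,</math>
the two differing only in how the symmetric weights <math>w_{ij}</math> are chosen: drawn at random in the SK model, set by the Hebbian rule in the Hopfield network. Asynchronous threshold updates never increase <math>E</math>, so the dynamics settle into one of its local minima, which in the Hopfield network serve as the retrieved memories.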