Hopfield network
==Capacity==
The network capacity of the Hopfield model is determined by the number of neurons and the connections within a given network; together these determine how many memories can be stored. It was shown that the ratio of reliably recallable vectors to nodes is about 0.138 (approximately 138 vectors can be recalled from storage for every 1000 nodes) (Hertz et al., 1991). Consequently, many recall errors will occur if one tries to store a larger number of vectors. When the Hopfield model does not recall the right pattern, an intrusion may have taken place: semantically related items tend to be confused, and the wrong pattern is recollected. The Hopfield network model is thus shown to confuse one stored item with another upon retrieval. Perfect recall and higher capacity (>0.14) can be loaded into the network by the Storkey learning method and by ETAM;<ref>{{cite journal | last1=Liou | first1=C.-Y. | last2=Lin | first2=S.-L. | s2cid=35025761 | title=Finite memory loading in hairy neurons |journal=Natural Computing |volume=5 |issue=1 |pages=15–42 |year=2006 |doi=10.1007/s11047-004-5490-x |url=http://ntur.lib.ntu.edu.tw//bitstream/246246/155192/1/19.pdf }}</ref><ref>{{cite journal | last1=Liou | first1=C.-Y. | last2=Yuan | first2=S.-K. | s2cid=6168346 | title=Error Tolerant Associative Memory |journal=Biological Cybernetics |volume=81 | issue=4 | pages=331–342 |year=1999 |doi=10.1007/s004220050566 | pmid=10541936 }}</ref> ETAM experiments are also reported in a related thesis.<ref>{{cite thesis | last=Yuan | first=S.-K. | title=Expanding basins of attraction of the associative memory | institution=National Taiwan University | degree=Master | id=991010725609704786 | date=June 1997 | url=https://ntu.primo.exlibrisgroup.com/discovery/fulldisplay?vid=886NTU_INST:886NTU_INST&search_scope=MyInstitution&tab=LibraryCatalog&docid=alma991010725609704786&lang=en&context=L&adaptor=Local%20Search%20Engine&query=any,contains,%E5%8A%89%E9%95%B7%E9%81%A0&facet=rtype,include,manuscripts&facet=searchcreationdate,include,1977%7C,%7C2019&facet=searchcreationdate,include,1992%7C,%7C2010&offset=0}}</ref> Later models inspired by the Hopfield network were devised to raise the storage limit and reduce the retrieval error rate, with some being capable of [[One-shot learning (computer vision)|one-shot learning]].<ref>{{cite book |last1=ABOUDIB |first1=Ala |last2=GRIPON |first2=Vincent |last3=JIANG |first3=Xiaoran |chapter=A study of retrieval algorithms of sparse messages in networks of neural cliques |title=COGNITIVE 2014 : The 6th International Conference on Advanced Cognitive Technologies and Applications |date=2014 |pages=140–6 |arxiv=1308.4506 |bibcode=2013arXiv1308.4506A |chapter-url=https://hal.archives-ouvertes.fr/hal-01058303/}}</ref>

The storage capacity can be given as <math>C \cong \frac{n}{2\log_2 n}</math>, where <math>n</math> is the number of neurons in the net.
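The capacity figures above can be illustrated with a minimal NumPy sketch of the classical Hebbian (outer-product) storage rule to which the 0.138 ratio refers. The neuron count, pattern count, and random seed are illustrative choices, not values from the literature; a stored pattern is recalled exactly only because the load (5/100 = 0.05) is kept well below the 0.138 limit.

```python
import numpy as np

def hopfield_capacity(n):
    """Approximate storage capacity C ~ n / (2 log2 n) for n neurons."""
    return n / (2 * np.log2(n))

# Hebbian (outer-product) storage of random bipolar patterns.
# 5 patterns on 100 neurons gives a load of 0.05, well below 0.138.
rng = np.random.default_rng(0)
n = 100
patterns = rng.choice([-1, 1], size=(5, n))
W = (patterns.T @ patterns).astype(float) / n
np.fill_diagonal(W, 0.0)  # no self-connections

def recall(state, steps=20):
    """Synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Well below capacity, each stored pattern is (almost surely) a fixed point.
restored = recall(patterns[0].copy())
```

Pushing the number of patterns past roughly 0.138·n makes such recalls increasingly error-prone, which is the intrusion behaviour described above.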