==Structure==
[[Image:Hopfield-net-vector.svg|thumb|A Hopfield net with four units]]
The units in Hopfield nets are binary threshold units, i.e. the units only take on two different values for their states, and the value is determined by whether or not the unit's input exceeds its threshold <math> U_i </math>. Discrete Hopfield nets describe relationships between binary (firing or not-firing) neurons <math>1,2,\ldots,i,j,\ldots,N</math>.<ref name="Hopfield1982" /> At a certain time, the state of the neural net is described by a vector <math> V </math>, which records which neurons are firing in a binary word of <math> N </math> bits.

The units usually take on the values 1 or −1, and this convention is used throughout this article; other literature may instead use units that take the values 0 and 1. The interactions <math> w_{ij} </math> between neurons are "learned" via [[Hebbian theory|Hebb's law of association]], such that, for a certain state <math> V^s </math> and distinct nodes <math>i,j</math>,
:<math> w_{ij} = V_i^s V_j^s ,</math>
but <math> w_{ii} = 0 </math>. (Note that the Hebbian learning rule takes the form <math> w_{ij} = (2V_i^s - 1)(2V_j^s -1) </math> when the units assume values in <math> \{0, 1\} </math>.) Once the network is trained, the weights <math> w_{ij} </math> no longer evolve. If a new state of neurons <math> V^{s'} </math> is introduced to the neural network, the net acts on neurons such that
* <math>V^{s'}_i \rightarrow 1 </math> if <math> \sum_j w_{ij} V^{s'}_j \ge U_i </math>
* <math>V^{s'}_i \rightarrow -1 </math> if <math> \sum_j w_{ij} V^{s'}_j < U_i </math>
where <math>U_i</math> is the threshold value of the ''i''-th neuron (often taken to be 0).<ref>{{cite journal |last1=Hopfield |first1=J. J. |title=Neural networks and physical systems with emergent collective computational abilities |journal= Proceedings of the National Academy of Sciences|date= 1982 |volume=79 |issue=8 |pages=2554–2558 |doi=10.1073/pnas.79.8.2554 |pmid=6953413 |pmc=346238 |bibcode=1982PNAS...79.2554H |doi-access=free }}</ref> In this way, Hopfield networks have the ability to "remember" states stored in the interaction matrix, because if a new state <math>V^{s'} </math> is subjected to the interaction matrix, each neuron will change until it matches the original state <math>V^{s} </math> (see the Updates section below).

The connections in a Hopfield net typically have the following restrictions:
* <math>w_{ii}=0, \forall i</math> (no unit has a connection with itself)
* <math>w_{ij} = w_{ji}, \forall i,j</math> (connections are symmetric)

The constraint that weights are symmetric guarantees that the energy function decreases monotonically while following the activation rules.<ref>{{cite book|last=MacKay|first=David J. C.|author-link=David J.C. MacKay|title=Information Theory, Inference and Learning Algorithms|url=https://archive.org/details/informationtheor00mack_607|url-access=limited|chapter=42. Hopfield Networks|year=2003|publisher=[[Cambridge University Press]]|page=[https://archive.org/details/informationtheor00mack_607/page/n519 508]|quote=This convergence proof depends crucially on the fact that the Hopfield network's connections are ''symmetric''. It also depends on the updates being made asynchronously.|isbn=978-0521642989}}</ref>
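A minimal sketch of these two rules, assuming [[NumPy]], the ±1 convention above, and thresholds <math>U_i = 0</math> (the function names are illustrative only; summing the Hebbian rule over several stored states is the usual generalisation of the single-state rule given above):
<syntaxhighlight lang="python">
import numpy as np

def hebbian_weights(patterns):
    """Build the interaction matrix from +/-1 patterns via Hebb's rule.

    For a single stored state V^s this reduces to w_ij = V_i^s V_j^s;
    the diagonal is zeroed so that w_ii = 0 (no self-connections).
    """
    patterns = np.atleast_2d(np.asarray(patterns, dtype=float))
    W = sum(np.outer(v, v) for v in patterns)
    np.fill_diagonal(W, 0.0)
    return W                      # symmetric by construction: w_ij = w_ji

def update(W, state, thresholds=None):
    """One sweep of asynchronous threshold updates.

    Each neuron i is set to +1 if sum_j w_ij V_j >= U_i and to -1 otherwise.
    """
    state = np.array(state, dtype=float)
    if thresholds is None:
        thresholds = np.zeros(len(state))   # U_i = 0 is the common choice
    for i in np.random.permutation(len(state)):
        state[i] = 1.0 if W[i] @ state >= thresholds[i] else -1.0
    return state
</syntaxhighlight>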
A network with asymmetric weights may exhibit some periodic or chaotic behavior; however, Hopfield found that this behavior is confined to relatively small parts of the phase space and does not impair the network's ability to act as a content-addressable associative memory system.

Hopfield also modeled neural nets for continuous values, in which the electric output of each neuron is not binary but some value between 0 and 1.<ref name=":0">{{cite journal |last1=Hopfield |first1=J. J. |title=Neurons with graded response have collective computational properties like those of two-state neurons |journal= Proceedings of the National Academy of Sciences|date= 1984 |volume=81 |issue=10 |pages=3088–3092 |doi=10.1073/pnas.81.10.3088 |pmid=6587342 |pmc=345226 |bibcode=1984PNAS...81.3088H |doi-access=free }}</ref> He found that this type of network was also able to store and reproduce memorized states.

Notice that every pair of units ''i'' and ''j'' in a Hopfield network has a connection that is described by the connectivity weight <math> w_{ij} </math>. In this sense, the Hopfield network can be formally described as a complete undirected graph <math> G = \langle V, f\rangle </math>, where <math>V</math> is a set of [[Artificial neuron|McCulloch–Pitts neurons]] and <math>f:V^2 \rightarrow \mathbb R</math> is a function that links pairs of units to a real value, the connectivity weight.
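Continuing the illustrative sketch above (same hypothetical helper functions), the content-addressable behavior can be demonstrated by corrupting a stored state and letting the asynchronous update rule restore it:
<syntaxhighlight lang="python">
# Store one pattern and recover it from a corrupted probe.
stored = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hebbian_weights([stored])

probe = stored.copy()
probe[:2] *= -1                      # flip two bits to corrupt the memory

state = probe.astype(float)
for _ in range(10):                  # sweep until the state stops changing
    new_state = update(W, state)
    if np.array_equal(new_state, state):
        break
    state = new_state

print(np.array_equal(state, stored))  # True: the stored state is recovered
</syntaxhighlight>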