Hopfield network
== Convergence properties of discrete and continuous Hopfield networks ==

[https://www.paradise.caltech.edu/bruck.html Bruck], in a 1990 paper,<ref name="Bruck1990">{{cite journal |last=Bruck |first=J. |date=October 1990 |title=On the convergence properties of the Hopfield model |url=https://resolver.caltech.edu/CaltechAUTHORS:20120426-132042598 |journal=Proc. IEEE |volume=78 |issue=10 |pages=1579–85 |doi=10.1109/5.58341}}</ref> studied discrete Hopfield networks and proved a generalized convergence theorem based on the connection between the network's dynamics and [[Cut (graph theory)|cuts in the associated graph]]. This generalization covers both asynchronous and synchronous dynamics, with elementary proofs based on greedy algorithms for [[Maximum cut|max-cut in graphs]]. A subsequent paper<ref name="Uykan2019">{{cite journal |last=Uykan |first=Z. |date=September 2020 |title=On the Working Principle of the Hopfield Neural Networks and its Equivalence to the GADIA in Optimization |url=https://ieeexplore.ieee.org/document/8859641 |journal=IEEE Transactions on Neural Networks and Learning Systems |volume=31 |issue=9 |pages=3294–3304 |doi=10.1109/TNNLS.2019.2940920 |pmid=31603804 |s2cid=204331533 |url-access=subscription}}</ref> further investigated the behavior of any neuron in both discrete-time and continuous-time Hopfield networks as the corresponding energy function is minimized during optimization. Bruck showed<ref name="Bruck1990"/> that neuron ''j'' changes its state ''if and only if'' the change further decreases the following biased pseudo-cut, which the discrete Hopfield network minimizes<ref name="Uykan2019"/> for its synaptic weight matrix:
: <math> J_{\text{pseudo-cut}}(k) = \sum_{i \in C_1(k)} \sum_{j \in C_2(k)} w_{ij} + \sum_{j \in C_1(k)} \theta_j </math>

where <math> C_1(k) </math> and <math> C_2(k) </math> denote the sets of neurons that are −1 and +1, respectively, at time <math> k </math>. For further details, see Uykan (2020).<ref name="Uykan2019"/>

The discrete-time Hopfield network always minimizes exactly the following pseudo-cut:<ref name="Bruck1990"/><ref name="Uykan2019"/>

: <math> U(k) = \sum_{i=1}^N \sum_{j=1}^{N} w_{ij} \left( s_i(k) - s_j(k) \right)^2 + 2 \sum_{j=1}^N \theta_j s_j(k) </math>

The continuous-time Hopfield network always minimizes an upper bound to the following weighted cut:<ref name="Uykan2019"/>

: <math> V(t) = \sum_{i=1}^N \sum_{j=1}^N w_{ij} \left( f(s_i(t)) - f(s_j(t)) \right)^2 + 2 \sum_{j=1}^N \theta_j f(s_j(t)) </math>

where <math> f(\cdot) </math> is a zero-centered sigmoid function. The complex Hopfield network, on the other hand, generally tends to minimize the so-called shadow-cut of the net's complex weight matrix.<ref name="Uykan2020">{{cite journal |first=Z. |last=Uykan |title=Shadow-Cuts Minimization/Maximization and Complex Hopfield Neural Networks |journal=IEEE Transactions on Neural Networks and Learning Systems |volume=32 |issue=3 |pages=1096–1109 |date=March 2021 |doi=10.1109/TNNLS.2020.2980237 |pmid=32310787 |s2cid=216047831 |doi-access=free}}</ref>
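The discrete-time convergence property above can be illustrated with a short NumPy sketch (the weights, thresholds, and network size here are arbitrary choices for illustration, not from the cited papers): under asynchronous updates with symmetric weights and no self-connections, the standard Hopfield energy never increases, which is the monotone-descent behavior that Bruck's cut-based analysis formalizes.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8  # illustrative network size

# Symmetric synaptic weight matrix with zero diagonal (no self-connections).
W = rng.standard_normal((N, N))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
theta = rng.standard_normal(N)  # bias/threshold terms

def energy(s):
    # Standard discrete Hopfield energy; its asynchronous minimization
    # corresponds to the (biased) cut quantities discussed above.
    return -0.5 * s @ W @ s + theta @ s

s = rng.choice([-1.0, 1.0], size=N)  # random bipolar initial state
energies = [energy(s)]
for _ in range(100):
    j = rng.integers(N)  # asynchronous: update one neuron at a time
    # Neuron j aligns with its local field, which can only lower the energy.
    s[j] = 1.0 if W[j] @ s - theta[j] >= 0 else -1.0
    energies.append(energy(s))

# The energy trace is monotone non-increasing.
assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:]))
```

Each single-neuron update changes the energy by a non-positive amount, so the trajectory settles into a local minimum; this is the "neuron ''j'' changes its state if and only if it decreases the pseudo-cut" statement seen through the energy function.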