== Contemporary developments, artificial intelligence, and computational advancements ==
Modern research has expanded upon Hebb's original ideas. [[Spike-timing-dependent plasticity]] (STDP), for example, refines Hebbian principles by making synaptic change depend on the precise relative timing of pre- and postsynaptic spikes. Experimental work has also linked Hebbian learning to complex behaviors, such as decision-making and emotional regulation.<ref name=":3">{{Citation |title=Hebbian models |date=2002 |work=Spiking Neuron Models: Single Neurons, Populations, Plasticity |pages=351–386 |editor-last=Kistler |editor-first=Werner M. |url=https://www.cambridge.org/core/books/abs/spiking-neuron-models/hebbian-models/214CE8710738E51AC1E73EC3830429A4 |access-date=2025-03-19 |place=Cambridge |publisher=Cambridge University Press |doi=10.1017/CBO9780511815706.011 |isbn=978-0-521-89079-3 |editor2-last=Gerstner |editor2-first=Wulfram |url-access=subscription}}</ref> Current studies in [[artificial intelligence]] (AI) and quantum computing continue to draw on Hebbian concepts in developing adaptive algorithms and improving machine learning models.<ref>Bi, G.-Q., & Poo, M.-M. (1998). Synaptic modifications in cultured hippocampal neurons: Dependence on spike timing, synaptic strength, and postsynaptic cell type. ''Journal of Neuroscience'', 18(24), 10464–10472.</ref>

In AI, Hebbian learning has seen applications beyond traditional neural networks. One significant advancement is in reinforcement learning algorithms, where Hebbian-like rules update connection weights based on the timing and strength of stimuli during training. Some researchers have also adapted Hebbian principles to develop more biologically plausible learning rules for artificial systems, which may improve model efficiency and convergence in AI applications.<ref>Oja, E. (1982). A simplified neuron model as a principal component analyzer. ''Journal of Mathematical Biology'', 15(3), 267–273.</ref><ref>Rumelhart, D. E., & McClelland, J. L. (1986). ''Parallel Distributed Processing: Explorations in the Microstructure of Cognition''. MIT Press.</ref>
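Oja's rule, cited above, is a standard example of such a biologically motivated refinement: a decay term keeps the weights bounded, and the weight vector of a single linear neuron converges to the first principal component of its inputs. The following minimal Python sketch is illustrative only; the learning rate, input distribution, and variable names are arbitrary choices rather than details taken from the cited paper.

<syntaxhighlight lang="python">
import numpy as np

def oja_update(w, x, eta=0.01):
    """One step of Oja's rule: dw = eta * y * (x - y * w).

    The -y**2 * w decay term keeps the weight norm bounded,
    unlike the plain Hebbian update dw = eta * y * x, which
    grows without limit on correlated input.
    """
    y = np.dot(w, x)                  # postsynaptic activity of a linear neuron
    return w + eta * y * (x - y * w)

# Illustrative usage: on correlated inputs the weight vector
# converges (up to sign) to the leading eigenvector of the
# input covariance, i.e. the first principal component.
rng = np.random.default_rng(0)
cov = [[3.0, 1.0], [1.0, 1.0]]
xs = rng.multivariate_normal([0.0, 0.0], cov, size=5000)

w = rng.normal(size=2)
for x in xs:
    w = oja_update(w, x)

print(w / np.linalg.norm(w))          # approx. ±(0.92, 0.38)
</syntaxhighlight>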
A growing area of interest is the application of Hebbian learning in quantum computing. While classical neural networks remain the primary area of application for Hebbian theory, recent studies have begun exploring quantum-inspired algorithms, which leverage the principles of quantum superposition and entanglement to enhance learning processes in quantum systems.<ref>Huang, H., & Li, Y. (2019). A quantum-inspired Hebbian learning algorithm for neural networks. ''Journal of Quantum Information Science'', 9(2), 111–124.</ref> Current research is exploring how Hebbian principles could inform the development of more efficient quantum machine learning models.<ref name=":2" />

New computational models have emerged that refine or extend Hebbian learning. Some account for the precise timing of neural spikes (as in spike-timing-dependent plasticity), while others integrate neuromodulation to capture how neurotransmitters such as dopamine affect the strength of synaptic connections (see the sketch below). These models provide a more nuanced understanding of how Hebbian learning operates in the brain and are contributing to the development of more realistic computational models.<ref>Miller, P., & Conver, A. (2012). Computational models of synaptic plasticity and learning. ''Current Opinion in Neurobiology'', 22(5), 648–655.</ref><ref>Béïque, J. C., & Andrade, R. (2012). Neuromodulation of synaptic plasticity in the hippocampus: Implications for learning and memory. ''Frontiers in Synaptic Neuroscience'', 4, 15.</ref>

Recent research has also focused on the role of inhibitory neurons, which traditional Hebbian models often overlook. While classic Hebbian theory concerns primarily excitatory synapses, more comprehensive models of neural learning now consider the balanced interaction between excitatory and inhibitory synapses. Studies suggest that inhibitory neurons provide critical regulation for maintaining stability in neural circuits and may prevent the runaway positive feedback that purely Hebbian potentiation can produce.<ref>Markram, H., Lübke, J., Frotscher, M., & Sakmann, B. (1997). Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs. ''Science'', 275(5297), 213–215.</ref><ref>Cohen, M. R., & Kohn, A. (2011). Measuring and interpreting neuronal correlations. ''Nature Neuroscience'', 14(7), 811–819.</ref>
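As a minimal illustration of the neuromodulated ("three-factor") updates mentioned above, the following Python sketch accumulates Hebbian pre/post coincidences in an eligibility trace and converts them into lasting weight changes only when a global reward signal, standing in for dopamine, arrives. All names and constants here are hypothetical rather than taken from the cited models, and the weight-decay term is included only to show one simple way of curbing the runaway growth discussed above.

<syntaxhighlight lang="python">
import numpy as np

def three_factor_step(w, trace, pre, post, reward,
                      eta=0.05, tau=0.9, decay=0.005):
    """One step of a generic reward-modulated ('three-factor') Hebbian rule.

    trace  -- eligibility trace: a decaying memory of recent
              pre/post coincidences (the Hebbian factor)
    reward -- global neuromodulatory signal (dopamine-like);
              it gates whether the trace alters the weights
    decay  -- mild weight decay, one simple way to keep purely
              Hebbian growth from running away
    All constants are illustrative, not fitted to data.
    """
    trace = tau * trace + np.outer(post, pre)   # accumulate coincidences
    w = w + eta * reward * trace - decay * w    # reward-gated weight change
    return w, trace

# Toy usage with 3 presynaptic and 2 postsynaptic units: reward
# arrives exactly when input 0 is active, so only coincidences
# aligned with reward are consolidated into the weights.
rng = np.random.default_rng(1)
w = np.zeros((2, 3))
trace = np.zeros_like(w)
for _ in range(500):
    pre = (rng.random(3) < 0.5).astype(float)   # binary presynaptic activity
    # output unit 0 follows input 0; output unit 1 fires at random
    post = np.array([pre[0], float(rng.random() < 0.5)])
    reward = pre[0]                             # dopamine-like success signal
    w, trace = three_factor_step(w, trace, pre, post, reward)

print(w)  # w[0, 0], the reward-aligned coincidence, grows strongest
</syntaxhighlight>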