===Restricted Boltzmann machine===
[[File:Restricted Boltzmann machine.svg|thumb|right|alt=Graphical representation of an example restricted Boltzmann machine|Graphical representation of a restricted Boltzmann machine. The four blue units represent hidden units, and the three red units represent visible units. In a restricted Boltzmann machine there are only connections (dependencies) between hidden and visible units, and none between units of the same type (no hidden-hidden, nor visible-visible connections).]]
{{Main|Restricted Boltzmann machine}}
Although learning is impractical in general Boltzmann machines, it can be made quite efficient in a restricted Boltzmann machine (RBM), which allows no intralayer connections: there are no visible-visible or hidden-hidden connections, only connections between visible and hidden units. After training one RBM, the activities of its hidden units can be treated as data for training a higher-level RBM. This method of stacking RBMs makes it possible to train many layers of hidden units efficiently and is one of the most common [[deep learning]] strategies. As each new layer is added, the generative model improves. An extension of the restricted Boltzmann machine allows the use of real-valued rather than binary data.<ref>{{Citation|title=Recent Developments in Deep Learning| date=22 March 2010 |url=https://www.youtube.com/watch?v=VdIURAu1-aU |archive-url=https://ghostarchive.org/varchive/youtube/20211222/VdIURAu1-aU |archive-date=2021-12-22 |url-status=live|language=en|access-date=2020-02-17}}{{cbignore}}</ref> One example of a practical RBM application is in speech recognition.<ref>{{cite journal |url=http://research.microsoft.com/pubs/144412/DBN4LVCSR-TransASLP.pdf |title=Context-Dependent Pre-trained Deep Neural Networks for Large Vocabulary Speech Recognition |journal=Microsoft Research |volume=20 |year=2011|last1=Yu |first1=Dong |last2=Dahl |first2=George |last3=Acero |first3=Alex |last4=Deng |first4=Li }}</ref>
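
The bipartite structure is what makes training tractable: with no hidden-hidden or visible-visible connections, the hidden units are conditionally independent given the visible units (and vice versa), so each conditional distribution factorizes and can be sampled in a single pass. The sketch below illustrates this with a minimal binary RBM updated by one step of contrastive divergence (CD-1); the class name, learning rate, and NumPy-based implementation are illustrative assumptions rather than code from a cited source.

<syntaxhighlight lang="python">
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary restricted Boltzmann machine (illustrative sketch)."""

    def __init__(self, n_visible, n_hidden, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = 0.01 * self.rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        # No hidden-hidden connections, so hidden units are
        # conditionally independent given the visible layer.
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        # Likewise, visible units are conditionally independent given h.
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0, lr=0.1):
        """One contrastive-divergence (CD-1) step on a batch of binary data."""
        h0 = self.hidden_probs(v0)
        h0_sample = (self.rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)      # one-step reconstruction
        h1 = self.hidden_probs(v1)
        batch = v0.shape[0]
        # Positive phase (data) minus negative phase (reconstruction).
        self.W   += lr * (v0.T @ h0 - v1.T @ h1) / batch
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (h0 - h1).mean(axis=0)
</syntaxhighlight>

In the stacking procedure described above, the hidden-unit probabilities (or samples) produced by <code>hidden_probs</code> on the training data would serve as the "visible" input for training the next RBM in the stack.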