== Alternatives ==
EM typically converges only to a local optimum, not necessarily the global optimum, and in general comes with no bound on its convergence rate. In high dimensions it can perform arbitrarily poorly, and the number of local optima can be exponential. Hence there is a need for alternative methods with guaranteed learning, especially in the high-dimensional setting. Alternatives to EM with better consistency guarantees exist, termed ''moment-based approaches''<ref>{{Cite journal|last=Pearson|first=Karl|date=1894|title=Contributions to the Mathematical Theory of Evolution|journal=Philosophical Transactions of the Royal Society of London A|volume=185|pages=71–110|issn=0264-3820|jstor=90667|doi=10.1098/rsta.1894.0003|bibcode=1894RSPTA.185...71P|doi-access=free}}</ref> or so-called ''spectral techniques''.<ref>{{Cite journal|last1=Shaban|first1=Amirreza|last2=Mehrdad|first2=Farajtabar|last3=Bo|first3=Xie|last4=Le|first4=Song|last5=Byron|first5=Boots|date=2015|title=Learning Latent Variable Models by Improving Spectral Solutions with Exterior Point Method|url=https://www.cc.gatech.edu/~bboots3/files/SpectralExteriorPoint-NIPSWorkshop.pdf|journal=UAI|pages=792–801|access-date=2019-06-12|archive-date=2016-12-24|archive-url=https://web.archive.org/web/20161224102320/https://www.cc.gatech.edu/~bboots3/files/SpectralExteriorPoint-NIPSWorkshop.pdf|url-status=dead}}</ref><ref>{{Cite book|title=Local Loss Optimization in Operator Models: A New Insight into Spectral Learning|last1=Balle|first1=Borja|last2=Quattoni|first2=Ariadna|last3=Carreras|first3=Xavier|date=2012-06-27|oclc=815865081}}</ref> Moment-based approaches to learning the parameters of a probabilistic model enjoy guarantees such as global convergence under certain conditions, unlike EM, which is often plagued by getting stuck in local optima. Algorithms with learning guarantees can be derived for a number of important models, such as mixture models and hidden Markov models (HMMs).
For these spectral methods, no spurious local optima occur, and the true parameters can be consistently estimated under some regularity conditions.{{Citation needed|date=April 2019}}
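The contrast with EM can be illustrated with a minimal moment-matching sketch (not from this article; the model, parameter values, and function names below are illustrative assumptions). For a two-component Gaussian mixture with known unit-variance components at known means and a single unknown mixing weight, matching the first sample moment gives a closed-form estimator with no iteration and no local optima:

```python
import random

def sample_mixture(n, w, mu1, mu2, seed=0):
    """Draw n samples from the mixture w*N(mu1, 1) + (1-w)*N(mu2, 1)."""
    rng = random.Random(seed)
    return [rng.gauss(mu1 if rng.random() < w else mu2, 1.0) for _ in range(n)]

def estimate_weight(xs, mu1, mu2):
    """Method-of-moments estimate of the mixing weight.

    The population moment identity E[X] = w*mu1 + (1 - w)*mu2
    solves in closed form to w = (mu2 - E[X]) / (mu2 - mu1);
    plugging in the sample mean gives a consistent estimator.
    """
    mean = sum(xs) / len(xs)
    return (mu2 - mean) / (mu2 - mu1)

xs = sample_mixture(50_000, w=0.3, mu1=-2.0, mu2=4.0)
w_hat = estimate_weight(xs, mu1=-2.0, mu2=4.0)
```

Here `w_hat` converges to the true weight 0.3 as the sample grows, directly from the law of large numbers. Real moment-based and spectral methods handle far richer models (all component parameters unknown, HMM transition matrices) by matching higher-order moments or decomposing moment tensors, but the principle is the same: solve explicit moment equations rather than iterate a non-convex likelihood objective as EM does.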