{{Short description|Set of random variables}}
[[File:markov random field example.png|thumb|alt=An example of a Markov random field.|An example of a Markov random field. Each edge represents dependency. In this example: A depends on B and D. B depends on A and D. D depends on A, B, and E. E depends on D and C. C depends on E.]]
In the domain of [[physics]] and [[probability]], a '''Markov random field''' ('''MRF'''), '''Markov network''' or '''undirected [[graphical model]]''' is a set of [[random variable]]s having a [[Markov property]] described by an [[undirected graph]]. In other words, a [[random field]] is said to be a [[Andrey Markov|Markov]] random field if it satisfies Markov properties. The concept originates from the [[Spin glass#Sherrington–Kirkpatrick model|Sherrington–Kirkpatrick model]].<ref>{{citation|title=Solvable Model of a Spin-Glass|number=35|year=1975|author1= Sherrington, David|author2=Kirkpatrick, Scott|journal=Physical Review Letters|volume=35|pages=1792–1796|doi=10.1103/PhysRevLett.35.1792|bibcode=1975PhRvL..35.1792S}}</ref>

A Markov network or MRF is similar to a [[Bayesian network]] in its representation of dependencies; the differences being that Bayesian networks are [[directed acyclic graph|directed and acyclic]], whereas Markov networks are undirected and may be cyclic. Thus, a Markov network can represent certain dependencies that a Bayesian network cannot (such as cyclic dependencies {{Explain|date=July 2018}}); on the other hand, it cannot represent certain dependencies that a Bayesian network can (such as induced dependencies {{Explain|date=July 2018}}).

The underlying graph of a Markov random field may be finite or infinite. When the [[joint probability distribution|joint probability density]] of the random variables is strictly positive, it is also referred to as a '''Gibbs random field''', because, according to the [[Hammersley–Clifford theorem]], it can then be represented by a [[Gibbs measure]] for an appropriate (locally defined) energy function.

The prototypical Markov random field is the [[Ising model]]; indeed, the Markov random field was introduced as the general setting for the Ising model.<ref name="Kindermann-Snell80">{{cite book |first1=Ross |last1=Kindermann |first2=J. Laurie |last2=Snell |url=http://www.cmap.polytechnique.fr/~rama/ehess/mrfbook.pdf |title=Markov Random Fields and Their Applications |year=1980 |publisher=American Mathematical Society |isbn=978-0-8218-5001-5 |mr=0620955 |access-date=2012-04-09 |archive-date=2017-08-10 |archive-url=https://web.archive.org/web/20170810092327/http://www.cmap.polytechnique.fr/%7Erama/ehess/mrfbook.pdf |url-status=dead }}</ref>

In the domain of [[artificial intelligence]], a Markov random field is used to model various low- to mid-level tasks in [[image processing]] and [[computer vision]].<ref>{{cite book |first1=S. Z. |last1=Li |title=Markov Random Field Modeling in Image Analysis |year=2009 |publisher=Springer |url=https://books.google.com/books?id=rDsObhDkCIAC |isbn=9781848002791 }}</ref>
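As a concrete illustration of the dependency structure in the figure, the following minimal sketch builds a Gibbs distribution over the graph A–B, A–D, B–D, D–E, E–C and checks one local Markov property by brute force. The Ising-like pairwise <code>potential</code>, the coupling value, and the use of binary spins in {−1, +1} are illustrative assumptions, not a standard implementation.

<syntaxhighlight lang="python">
import itertools
import math

# Undirected graph from the figure: each edge encodes a direct dependency.
edges = [("A", "B"), ("A", "D"), ("B", "D"), ("D", "E"), ("E", "C")]
variables = ["A", "B", "C", "D", "E"]

# Illustrative Ising-like pairwise potential: favours equal neighbouring spins.
def potential(x, y, coupling=1.0):
    return math.exp(coupling * (1 if x == y else -1))

# Unnormalised Gibbs density: product of pairwise potentials over the edges.
def unnormalised_p(assignment):
    p = 1.0
    for u, v in edges:
        p *= potential(assignment[u], assignment[v])
    return p

# Brute-force joint distribution over binary spins in {-1, +1}.
states = [dict(zip(variables, values))
          for values in itertools.product([-1, +1], repeat=len(variables))]
Z = sum(unnormalised_p(s) for s in states)  # partition function
joint = {tuple(s[v] for v in variables): unnormalised_p(s) / Z for s in states}

# Local Markov property: conditioned on its neighbours B and D,
# variable A is independent of the non-adjacent variables C and E.
def conditional_of_A(b, d, c, e):
    weights = {a: joint[(a, b, c, d, e)] for a in (-1, +1)}
    z = sum(weights.values())
    return {a: w / z for a, w in weights.items()}

print(conditional_of_A(b=+1, d=+1, c=+1, e=+1))
print(conditional_of_A(b=+1, d=+1, c=-1, e=-1))  # identical: C and E are irrelevant
</syntaxhighlight>

The two printed conditional distributions coincide: once the neighbours B and D of A are fixed, changing the non-adjacent variables C and E leaves the conditional distribution of A unchanged, which is exactly the Markov property encoded by the undirected graph.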