Markov blanket
{{Short description|Subset of variables that contains all the useful information}}
[[Image:Diagram of a Markov blanket.svg|frame|In a [[Bayesian network]], the Markov boundary of node ''A'' includes its parents, children and the other parents of all of its children.]]
In [[statistics]] and [[machine learning]], when one wants to infer a random variable from a set of variables, a subset of that set is usually sufficient, and the remaining variables carry no additional information. Such a subset that contains all the useful information is called a '''Markov blanket'''. If a Markov blanket is minimal, meaning that no variable can be dropped from it without losing information, it is called a '''Markov boundary'''. Identifying a Markov blanket or a Markov boundary helps to extract useful features. The terms Markov blanket and Markov boundary were coined by [[Judea Pearl]] in 1988.<ref>{{cite book |last=Pearl |first=Judea |authorlink=Judea Pearl |title=Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference |publisher=Morgan Kaufmann |location=San Mateo CA |year=1988 |isbn=0-934613-73-7 |series=Representation and Reasoning Series |url-access=registration |url=https://archive.org/details/probabilisticrea00pear }}</ref> A Markov blanket can be constituted by a set of [[Markov chain]]s.<!--[[Markov chain#Testing]]-->

== Markov blanket ==
A Markov blanket of a random variable <math>Y</math> in a random variable set <math>\mathcal{S}=\{X_1,\ldots,X_n\}</math> is any subset <math>\mathcal{S}_1</math> of <math>\mathcal{S}</math>, conditioned on which the other variables are independent of <math>Y</math>:
<math display="block">Y\perp \!\!\! \perp\mathcal{S}\backslash\mathcal{S}_1 \mid \mathcal{S}_1.</math>
This means that <math>\mathcal{S}_1</math> contains at least all the information one needs to infer <math>Y</math>; the variables in <math>\mathcal{S}\backslash\mathcal{S}_1</math> are redundant.

In general, a given Markov blanket is not unique. Any subset of <math>\mathcal{S}</math> that contains a Markov blanket is also a Markov blanket itself. Specifically, <math>\mathcal{S}</math> is a Markov blanket of <math>Y</math> in <math>\mathcal{S}</math>.

== Markov boundary ==
A '''Markov boundary''' of <math>Y</math> in <math>\mathcal{S}</math> is a subset <math>\mathcal{S}_2</math> of <math>\mathcal{S}</math> such that <math>\mathcal{S}_2</math> itself is a Markov blanket of <math>Y</math>, but no proper subset of <math>\mathcal{S}_2</math> is a Markov blanket of <math>Y</math>. In other words, a Markov boundary is a minimal Markov blanket.

The Markov boundary of a [[Vertex (graph theory)|node]] <math>A</math> in a [[Bayesian network]] is the set of nodes composed of <math>A</math>'s parents, <math>A</math>'s children, and <math>A</math>'s children's other parents. In a [[Markov random field]], the Markov boundary of a node is the set of its neighboring nodes. In a [[Dependency network (graphical model)|dependency network]], the Markov boundary of a node is the set of its parents.
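The following sketch illustrates the Bayesian-network rule above, assuming the network structure is given as a mapping from each node to its set of parents. The function name, the mapping format, and the example graph are illustrative only, not part of any standard library.

<syntaxhighlight lang="python">
def markov_boundary(node, parents):
    """Markov boundary of ``node`` in a Bayesian network.

    ``parents`` maps every node of the DAG to the set of its parents.
    The boundary is the union of the node's parents, its children,
    and its children's other parents (co-parents).
    """
    children = {v for v, ps in parents.items() if node in ps}
    co_parents = set().union(*(parents[c] for c in children)) if children else set()
    return (parents[node] | children | co_parents) - {node}

# Illustrative DAG: P -> A -> C <- Q
example_parents = {
    "P": set(),
    "Q": set(),
    "A": {"P"},
    "C": {"A", "Q"},
}
print(markov_boundary("A", example_parents))  # {'P', 'C', 'Q'}
</syntaxhighlight>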
=== Uniqueness of Markov boundary ===
The Markov boundary always exists. Under some mild conditions, the Markov boundary is unique. However, in most practical and theoretical scenarios, multiple Markov boundaries may provide alternative solutions.<ref>{{cite journal |last1=Statnikov |first1=Alexander |last2=Lytkin |first2=Nikita I. |last3=Lemeire |first3=Jan |last4=Aliferis |first4=Constantin F. |title=Algorithms for discovery of multiple Markov boundaries |journal=Journal of Machine Learning Research |date=2013 |volume=14 |pages=499–566 |url=http://www.jmlr.org/papers/volume14/statnikov13a/statnikov13a.pdf}}</ref> When there are multiple Markov boundaries, quantities measuring causal effect may fail to be well defined.<ref>{{cite journal |last1=Wang |first1=Yue |last2=Wang |first2=Linbo |title=Causal inference in degenerate systems: An impossibility result |journal=Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics |date=2020 |pages=3383–3392 |url=http://proceedings.mlr.press/v108/wang20i.html}}</ref>

== See also ==
* [[Andrey Markov]]
* [[Free energy principle#Free energy minimisation|Free energy minimisation]]
* [[Moral graph]]
* [[Separation of concerns]]
* [[Causality]]
* [[Causal inference]]

== Notes ==
<references/>

[[Category:Bayesian networks]]
[[Category:Markov networks]]