Causal Markov condition
The '''Markov condition''', sometimes called the '''Markov assumption''', is an assumption made in [[Bayesian probability theory]], that every node in a [[Bayesian network]] is [[conditionally independent]] of its nondescendants, given its parents. Stated loosely, it is assumed that a node has no bearing on nodes which do not descend from it. In a [[Directed acyclic graph|DAG]], this local Markov condition is equivalent to the global Markov condition, which states that [[Bayesian network#d-separation|d-separations]] in the graph also correspond to conditional independence relations.<ref>{{cite journal |last1=Geiger |first1=Dan |last2=Pearl |first2=Judea |title=On the Logic of Causal Models |journal=Machine Intelligence and Pattern Recognition |date=1990 |volume=9 |pages=3–14 |doi=10.1016/b978-0-444-88650-7.50006-8}}</ref><ref>{{cite journal |last1=Lauritzen |first1=S. L. |last2=Dawid |first2=A. P. |last3=Larsen |first3=B. N. |last4=Leimer |first4=H.-G. |title=Independence properties of directed Markov fields |journal=Networks |date=August 1990 |volume=20 |issue=5 |pages=491–505 |doi=10.1002/net.3230200503}}</ref> This also means that a node is conditionally independent of the entire network, given its [[Markov blanket]]. The related '''Causal Markov (CM) condition''' states that, conditional on the set of all its direct causes, a node is independent of all variables which are not effects or direct causes of that node.<ref name=":0">{{cite journal |last1=Hausman |first1=D.M. |last2=Woodward |first2=J. |title=Independence, Invariance, and the Causal Markov Condition |journal=British Journal for the Philosophy of Science |volume=50 |issue=4 |pages=521–583 |date=December 1999 |doi= 10.1093/bjps/50.4.521|url=http://philosophy.wisc.edu/hausman/papers/bjps.pdf }}</ref> In the event that the structure of a Bayesian network accurately depicts [[causality]], the two conditions are equivalent.
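Equivalently, the local Markov condition holds for a distribution over a DAG exactly when the joint distribution factorizes according to the graph: for variables ''X''<sub>1</sub>, …, ''X''<sub>''n''</sub> with parent sets Pa(''X''<sub>''i''</sub>),

:<math>P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Pa}(X_i)\bigr).</math>

For example, in the chain ''A'' → ''B'' → ''C'', the factorization ''P''(''A'')''P''(''B''&nbsp;|&nbsp;''A'')''P''(''C''&nbsp;|&nbsp;''B'') encodes that ''C'' is conditionally independent of its nondescendant ''A'' given its parent ''B''.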
However, a network may accurately embody the Markov condition without depicting causality, in which case it should not be assumed to embody the causal Markov condition.

== Motivation ==
{{Main|Probabilistic causation}}
Statisticians are enormously interested in the ways in which certain events and variables are connected. A precise notion of what constitutes a cause and effect is necessary to understand these connections. The central idea behind the philosophical study of probabilistic causation is that causes raise the probabilities of their effects, [[all else being equal]]. A [[Causal Determinism|deterministic]] interpretation of causation means that if ''A'' causes ''B'', then ''A'' must ''always'' be followed by ''B''. In this sense, smoking does not cause cancer because some smokers never develop cancer. On the other hand, a [[Probabilistic causation|probabilistic]] interpretation simply means that causes raise the probability of their effects. In this sense, changes in meteorological readings associated with a storm do cause that storm, since they raise its probability. (However, merely looking at a barometer does not change the probability of the storm; for a more detailed analysis, see Pearl.<ref>{{Cite book|title=Causality|last=Pearl|first=Judea|date=2009|publisher=Cambridge University Press|isbn=9780511803161|location=Cambridge|doi = 10.1017/cbo9780511803161}}</ref>)
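The probability-raising idea is commonly formalized by requiring that a cause raise the probability of its effect relative to the cause's absence, all else held fixed:

:<math>P(E \mid C) > P(E \mid \neg C).</math>

Under this reading, smoking causes cancer when the probability of cancer among smokers exceeds that among non-smokers, even though not every smoker develops cancer.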
== Implications ==
{{Expand section|date=July 2019}}

=== Dependence and causation ===
It follows from the definition that if ''X'' and ''Y'' are in '''V''' and are probabilistically dependent, then either ''X'' causes ''Y'', ''Y'' causes ''X'', or ''X'' and ''Y'' are both effects of some common cause ''Z'' in '''V'''.<ref name=":0" /> This principle was seminally introduced by Hans Reichenbach as the common cause principle (CCP).<ref>{{Cite book|title=The Direction of Time|last=Reichenbach|first=Hans|date=1956|publisher=University of California Press|isbn=9780486409269|location=Los Angeles}}</ref>

=== Screening ===
It likewise follows from the definition that the parents of ''X'' screen ''X'' off from other "indirect causes" of ''X'' (parents of Parents(''X'')) and from other effects of Parents(''X'') which are not also effects of ''X''.<ref name=":0" />

== Examples ==
In a simple view, releasing one's hand from a hammer causes the hammer to fall. However, doing so in outer space does not produce the same outcome, calling into question whether releasing one's fingers from a hammer ''always'' causes it to fall. A causal graph could be created to acknowledge that both the presence of gravity and the release of the hammer contribute to its falling. However, it would be very surprising if the surface underneath the hammer affected its falling. This essentially states the causal Markov condition: given the existence of gravity and the release of the hammer, it will fall regardless of what is beneath it.

== See also ==
* [[Causal model]]

== Notes ==
{{reflist}}

{{DEFAULTSORT:Causal Markov Condition}}
[[Category:Bayesian networks]]
[[Category:Causality]]