{{Short description|Probability theory concept}}
{{see also|Conditional dependence}}
{{Probability fundamentals}}
In [[probability theory]], '''conditional independence''' describes situations wherein an observation is irrelevant or redundant when evaluating the certainty of a hypothesis. Conditional independence is usually formulated in terms of [[conditional probability]], as a special case where the probability of the hypothesis given the uninformative observation is equal to the probability without that observation. If <math>A</math> is the hypothesis, and <math>B</math> and <math>C</math> are observations, conditional independence can be stated as an equality:

:<math>P(A\mid B,C) = P(A \mid C)</math>

where <math>P(A \mid B, C)</math> is the probability of <math>A</math> given both <math>B</math> and <math>C</math>. Since the probability of <math>A</math> given <math>C</math> is the same as the probability of <math>A</math> given both <math>B</math> and <math>C</math>, this equality expresses that <math>B</math> contributes nothing to the certainty of <math>A</math>. In this case, <math>A</math> and <math>B</math> are said to be '''conditionally independent''' given <math>C</math>, written symbolically as <math>(A \perp\!\!\!\perp B \mid C)</math>.

The concept of conditional independence is essential to graph-based theories of statistical inference, as it establishes a mathematical relation between a collection of conditional statements and a [[graphoid]].

==Conditional independence of events==
Let <math>A</math>, <math>B</math>, and <math>C</math> be [[Event (probability theory)|events]]. <math>A</math> and <math>B</math> are said to be '''conditionally independent''' given <math>C</math> if and only if <math>P(C) > 0</math> and:

:<math>P(A \mid B, C) = P(A \mid C)</math>

This property is often written <math>(A \perp\!\!\!\perp B \mid C)</math>, which should be read <math>((A \perp\!\!\!\perp B) \vert C)</math>.

Equivalently, conditional independence may be stated as:

:<math>P(A,B\mid C) = P(A\mid C)P(B\mid C)</math>

where <math>P(A,B\mid C)</math> is the [[joint probability]] of <math>A</math> and <math>B</math> given <math>C</math>. This alternate formulation states that <math>A</math> and <math>B</math> are [[Independence (probability theory)|independent events]], '''given''' <math>C</math>. It demonstrates that <math>(A \perp\!\!\!\perp B \mid C)</math> is equivalent to <math>(B \perp\!\!\!\perp A \mid C)</math>.

=== Proof of the equivalent definition ===
:<math>P(A, B \mid C) = P(A\mid C)P(B\mid C)</math>
:iff <math>\frac{P(A, B, C)}{P(C)} = \left(\frac{P(A, C)}{P(C)}\right) \left(\frac{P(B, C)}{P(C)} \right)</math> (definition of [[conditional probability]])
:iff <math>P(A, B, C) = \frac{P(A, C) P(B, C)}{P(C)}</math> (multiply both sides by <math>P(C)</math>)
:iff <math>\frac{P(A, B, C)}{P(B, C)}= \frac{P(A, C)}{P(C)}</math> (divide both sides by <math>P(B, C)</math>)
:iff <math>P(A \mid B, C) = P(A \mid C)</math> (definition of conditional probability) <math>\therefore</math>
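A minimal numerical sketch in Python can make the equivalence concrete. The joint distribution below is an illustrative choice, not taken from any source, in which <math>A</math> and <math>B</math> are independent within each value of <math>C</math>; the helper <code>pr</code> and the probability values are assumptions made purely for this example. The script confirms that the two formulations above agree.

<syntaxhighlight lang="python">
from itertools import product

# Illustrative joint distribution over three binary events A, B and C,
# constructed so that A and B are independent given each value of C.
p_c = {1: 0.5, 0: 0.5}          # P(C)
p_a_given_c = {1: 0.3, 0: 0.8}  # P(A | C)
p_b_given_c = {1: 0.6, 0: 0.2}  # P(B | C)

joint = {}
for a, b, c in product([0, 1], repeat=3):
    pa = p_a_given_c[c] if a else 1 - p_a_given_c[c]
    pb = p_b_given_c[c] if b else 1 - p_b_given_c[c]
    joint[(a, b, c)] = p_c[c] * pa * pb

def pr(event):
    """Total probability of the outcomes satisfying the predicate `event`."""
    return sum(p for (a, b, c), p in joint.items() if event(a, b, c))

for c0 in (0, 1):
    # First formulation: P(A | B, C) = P(A | C)
    lhs1 = pr(lambda a, b, c: a and b and c == c0) / pr(lambda a, b, c: b and c == c0)
    rhs1 = pr(lambda a, b, c: a and c == c0) / pr(lambda a, b, c: c == c0)
    # Equivalent formulation: P(A, B | C) = P(A | C) P(B | C)
    lhs2 = pr(lambda a, b, c: a and b and c == c0) / pr(lambda a, b, c: c == c0)
    rhs2 = rhs1 * pr(lambda a, b, c: b and c == c0) / pr(lambda a, b, c: c == c0)
    print(c0, abs(lhs1 - rhs1) < 1e-12, abs(lhs2 - rhs2) < 1e-12)   # both True for each value of C
</syntaxhighlight>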
===Examples===

==== Coloured boxes ====
Each cell represents a possible outcome. The events <math>\color{red}R</math>, <math>\color{blue}B</math> and <math>\color{gold}Y</math> are represented by the areas shaded {{font color|red|red}}, {{font color|blue|blue}} and {{font color|gold|yellow}} respectively. The overlap between the events <math>\color{red}R</math> and <math>\color{blue}B</math> is shaded {{font color|purple|purple}}.

[[Image:Conditional independence.svg|450px|These are two examples illustrating '''conditional independence'''.]]

The probabilities of these events are shaded areas with respect to the total area. In both examples <math>\color{red}R</math> and <math>\color{blue}B</math> are conditionally independent given <math>\color{gold}Y</math> because:

:<math>\Pr({\color{red}R}, {\color{blue}B} \mid {\color{gold}Y}) = \Pr({\color{red}R} \mid {\color{gold}Y})\Pr({\color{blue}B} \mid {\color{gold}Y})</math><ref>To see that this is the case, one needs to realise that Pr(''R'' ∩ ''B'' | ''Y'') is the probability of an overlap of ''R'' and ''B'' (the purple shaded area) in the ''Y'' area. Since, in the picture on the left, there are two squares where ''R'' and ''B'' overlap within the ''Y'' area, and the ''Y'' area has twelve squares, Pr(''R'' ∩ ''B'' | ''Y'') = {{sfrac|2|12}} = {{sfrac|1|6}}. Similarly, Pr(''R'' | ''Y'') = {{sfrac|4|12}} = {{sfrac|1|3}} and Pr(''B'' | ''Y'') = {{sfrac|6|12}} = {{sfrac|1|2}}.</ref>

but not conditionally independent given <math>\left[ \text{not }{\color{gold}Y}\right]</math> because:

:<math>\Pr({\color{red}R}, {\color{blue}B} \mid \text{not } {\color{gold}Y}) \not= \Pr({\color{red}R} \mid \text{not } {\color{gold}Y})\Pr({\color{blue}B} \mid \text{not } {\color{gold}Y})</math>

==== Proximity and delays ====
Let A be the event that person A will be home in time for dinner, and let B be the event that person B will be home in time for dinner, where both people are randomly sampled from the entire world. Events A and B can be assumed to be independent, i.e. knowing that A is late changes the probability that B will be late little, if at all. However, conditional on a third event, that person A and person B live in the same neighborhood, the two events are no longer independent: traffic conditions and weather-related events that might delay person A might delay person B as well. Given the third event and the knowledge that person A was late, the probability that person B will be late does change meaningfully.<ref name=":0">[https://math.stackexchange.com/q/23093 Could someone explain conditional independence?]</ref>

==== Dice rolling ====
Conditional independence depends on the nature of the third event. If you roll two dice, one may assume that the two dice behave independently of each other: looking at the result of one die will not tell you about the result of the other (that is, the two dice are independent). If, however, the first die's result is a 3, and someone tells you about a third event, that the sum of the two results is even, then this extra piece of information restricts the options for the second result to an odd number. In other words, two events can be independent, but not conditionally independent.<ref name=":0" />
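The dice example can be checked by direct enumeration. The following Python sketch is illustrative only: it lists all 36 equally likely outcomes and shows that the two results are independent unconditionally, but fail the conditional-independence test once one conditions on the sum being even.

<syntaxhighlight lang="python">
from itertools import product
from fractions import Fraction

rolls = list(product(range(1, 7), repeat=2))   # the 36 equally likely outcomes
p = Fraction(1, len(rolls))

def pr(event):
    """Total probability of the rolls satisfying the predicate `event`."""
    return sum((p for d1, d2 in rolls if event(d1, d2)), Fraction(0))

# Unconditionally the dice are independent: P(D1=3, D2=5) = P(D1=3) * P(D2=5) = 1/36.
print(pr(lambda a, b: a == 3 and b == 5) == pr(lambda a, b: a == 3) * pr(lambda a, b: b == 5))

# Conditioning on the third event "the sum is even" destroys that independence.
even = lambda a, b: (a + b) % 2 == 0
p_even = pr(even)
lhs = pr(lambda a, b: a == 3 and b == 5 and even(a, b)) / p_even          # 1/18
rhs = (pr(lambda a, b: a == 3 and even(a, b)) / p_even) * \
      (pr(lambda a, b: b == 5 and even(a, b)) / p_even)                   # 1/36
print(lhs, rhs, lhs == rhs)   # 1/18 1/36 False: independent, but not conditionally independent
</syntaxhighlight>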
====Height and vocabulary====
Height and vocabulary are dependent, since very small people tend to be children, known for their more basic vocabularies. But conditional on age, for example knowing that two people are both 19 years old, there is no reason to think that the taller person has the larger vocabulary.

==Conditional independence of random variables==
Two discrete [[random variable]]s <math>X</math> and <math>Y</math> are conditionally independent given a third discrete random variable <math>Z</math> if and only if they are [[Independence (probability theory)|independent]] in their [[conditional probability distribution]] given <math>Z</math>. That is, <math>X</math> and <math>Y</math> are conditionally independent given <math>Z</math> if and only if, given any value of <math>Z</math>, the probability distribution of <math>X</math> is the same for all values of <math>Y</math> and the probability distribution of <math>Y</math> is the same for all values of <math>X</math>. Formally:

{{Equation box 1 |indent = |title= |equation = {{NumBlk||<math>(X \perp\!\!\!\perp Y) \mid Z \quad \iff \quad F_{X,Y\,\mid\,Z\,=\,z}(x,y) = F_{X\,\mid\,Z\,=\,z}(x) \cdot F_{Y\,\mid\,Z\,=\,z}(y) \quad \text{for all } x,y,z</math>|{{EquationRef|Eq.2}}}} |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA}}

where <math>F_{X,Y\,\mid\,Z\,=\,z}(x,y)=\Pr(X \leq x, Y \leq y \mid Z=z)</math> is the conditional [[cumulative distribution function]] of <math>X</math> and <math>Y</math> given <math>Z</math>.

Two events <math>R</math> and <math>B</math> are conditionally independent given a [[sigma-algebra|σ-algebra]] <math>\Sigma</math> if

:<math>\Pr(R, B \mid \Sigma) = \Pr(R \mid \Sigma)\Pr(B \mid \Sigma) \text{ a.s.}</math>

where <math>\Pr(A \mid \Sigma)</math> denotes the [[conditional expectation]] of the [[indicator function]] of the event <math>A</math>, <math>\chi_A</math>, given the σ-algebra <math>\Sigma</math>. That is,

:<math>\Pr(A \mid \Sigma) := \operatorname{E}[\chi_A\mid\Sigma].</math>

Two random variables <math>X</math> and <math>Y</math> are conditionally independent given a σ-algebra <math>\Sigma</math> if the above equation holds for all <math>R</math> in <math>\sigma(X)</math> and <math>B</math> in <math>\sigma(Y)</math>.

Two random variables <math>X</math> and <math>Y</math> are conditionally independent given a random variable <math>W</math> if they are independent given ''σ''(''W''): the σ-algebra generated by <math>W</math>. This is commonly written:

:<math>X \perp\!\!\!\perp Y \mid W </math>

or

:<math>X \perp Y \mid W</math>

This is read "<math>X</math> is independent of <math>Y</math>, '''given''' <math>W</math>"; the conditioning applies to the whole statement: "(<math>X</math> is independent of <math>Y</math>) given <math>W</math>".

:<math>(X \perp\!\!\!\perp Y) \mid W</math>

This notation extends <math>X \perp\!\!\!\perp Y</math> for "<math>X</math> is [[Independence (probability theory)|independent]] of <math>Y</math>."

If <math>W</math> assumes a countable set of values, this is equivalent to the conditional independence of ''X'' and ''Y'' for the events of the form <math>[W=w]</math>. Conditional independence of more than two events, or of more than two random variables, is defined analogously.

The following two examples show that <math>X \perp\!\!\!\perp Y</math> ''neither implies nor is implied by'' <math>(X \perp\!\!\!\perp Y) \mid W</math>.

First, suppose <math>W</math> is 0 with probability 0.5 and 1 otherwise. When ''W'' = 0 take <math>X</math> and <math>Y</math> to be independent, each having the value 0 with probability 0.99 and the value 1 otherwise. When <math>W=1</math>, <math>X</math> and <math>Y</math> are again independent, but this time they take the value 1 with probability 0.99. Then <math>(X \perp\!\!\!\perp Y) \mid W</math>. But <math>X</math> and <math>Y</math> are dependent, because Pr(''X'' = 0) < Pr(''X'' = 0 | ''Y'' = 0). This is because Pr(''X'' = 0) = 0.5, but if ''Y'' = 0 then it is very likely that ''W'' = 0 and thus that ''X'' = 0 as well, so Pr(''X'' = 0 | ''Y'' = 0) > 0.5.
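A short Python sketch of this construction, using the 0.5 and 0.99 values from the example above (the variable names are illustrative), enumerates the joint distribution of <math>(W, X, Y)</math> and confirms both claims: <math>X</math> and <math>Y</math> factorize within each value of <math>W</math>, yet Pr(''X'' = 0 | ''Y'' = 0) is far above Pr(''X'' = 0).

<syntaxhighlight lang="python">
from itertools import product

# Mixture construction from the example: W is 0 or 1 with probability 0.5, and
# given W the variables X and Y are i.i.d. with P(X=0 | W=0) = 0.99, P(X=0 | W=1) = 0.01.
p_zero_given_w = {0: 0.99, 1: 0.01}

joint = {}
for w, x, y in product([0, 1], repeat=3):
    px = p_zero_given_w[w] if x == 0 else 1 - p_zero_given_w[w]
    py = p_zero_given_w[w] if y == 0 else 1 - p_zero_given_w[w]
    joint[(w, x, y)] = 0.5 * px * py

def pr(event):
    return sum(p for (w, x, y), p in joint.items() if event(w, x, y))

# Marginally X and Y are dependent: Pr(X=0) = 0.5 but Pr(X=0 | Y=0) is close to 0.98.
print(pr(lambda w, x, y: x == 0),
      pr(lambda w, x, y: x == 0 and y == 0) / pr(lambda w, x, y: y == 0))

# Given W, however, they are conditionally independent: P(X=0, Y=0 | W) = P(X=0 | W) P(Y=0 | W).
for w0 in (0, 1):
    pw = pr(lambda w, x, y: w == w0)
    lhs = pr(lambda w, x, y: w == w0 and x == 0 and y == 0) / pw
    rhs = (pr(lambda w, x, y: w == w0 and x == 0) / pw) * (pr(lambda w, x, y: w == w0 and y == 0) / pw)
    print(w0, abs(lhs - rhs) < 1e-12)   # True for both values of W
</syntaxhighlight>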
For the second example, suppose <math>X \perp\!\!\!\perp Y</math>, each taking the values 0 and 1 with probability 0.5. Let <math>W</math> be the product <math>X \cdot Y</math>. Then, conditional on <math>W=0</math>, Pr(''X'' = 0 | ''W'' = 0) = 2/3, but Pr(''X'' = 0 | ''Y'' = 0, ''W'' = 0) = 1/2, so <math>(X \perp\!\!\!\perp Y) \mid W</math> is false. This is also an example of explaining away. See Kevin Murphy's tutorial<ref>{{Cite web|url=http://people.cs.ubc.ca/~murphyk/Bayes/bnintro.html|title=Graphical Models}}</ref> where <math>X</math> and <math>Y</math> take the values "brainy" and "sporty".
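The two probabilities quoted above can be verified by enumerating the four equally likely outcomes, as in the following illustrative Python sketch.

<syntaxhighlight lang="python">
from itertools import product
from fractions import Fraction

# X and Y are independent fair 0/1 variables; W is their product.
outcomes = [(x, y, x * y) for x, y in product([0, 1], repeat=2)]   # four equally likely outcomes
p = Fraction(1, 4)

def pr(event):
    return sum((p for x, y, w in outcomes if event(x, y, w)), Fraction(0))

p_x0_given_w0 = pr(lambda x, y, w: x == 0 and w == 0) / pr(lambda x, y, w: w == 0)
p_x0_given_y0_w0 = pr(lambda x, y, w: x == 0 and y == 0 and w == 0) / pr(lambda x, y, w: y == 0 and w == 0)
print(p_x0_given_w0, p_x0_given_y0_w0)   # 2/3 and 1/2, so X and Y are not independent given W
</syntaxhighlight>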
==Conditional independence of random vectors==
Two [[random vector]]s <math>\mathbf{X}=(X_1,\ldots,X_l)^{\mathrm T}</math> and <math>\mathbf{Y}=(Y_1,\ldots,Y_m)^{\mathrm T}</math> are conditionally independent given a third random vector <math>\mathbf{Z}=(Z_1,\ldots,Z_n)^{\mathrm T}</math> if and only if they are independent in their conditional cumulative distribution given <math>\mathbf{Z}</math>. Formally:

{{Equation box 1 |indent = |title= |equation = {{NumBlk||<math>(\mathbf{X} \perp\!\!\!\perp \mathbf{Y}) \mid \mathbf{Z} \quad \iff \quad F_{\mathbf{X},\mathbf{Y}\,\mid\,\mathbf{Z}\,=\,\mathbf{z}}(\mathbf{x},\mathbf{y}) = F_{\mathbf{X}\,\mid\,\mathbf{Z}\,=\,\mathbf{z}}(\mathbf{x}) \cdot F_{\mathbf{Y}\,\mid\,\mathbf{Z}\,=\,\mathbf{z}}(\mathbf{y}) \quad \text{for all } \mathbf{x},\mathbf{y},\mathbf{z}</math>|{{EquationRef|Eq.3}}}} |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA}}

where <math>\mathbf{x}=(x_1,\ldots,x_l)^{\mathrm T}</math>, <math>\mathbf{y}=(y_1,\ldots,y_m)^{\mathrm T}</math> and <math>\mathbf{z}=(z_1,\ldots,z_n)^{\mathrm T}</math>, and the conditional cumulative distributions are defined as follows.

: <math>\begin{align} F_{\mathbf{X},\mathbf{Y}\,\mid\,\mathbf{Z}\,=\,\mathbf{z}}(\mathbf{x},\mathbf{y}) &= \Pr(X_1 \leq x_1,\ldots,X_l \leq x_l, Y_1 \leq y_1,\ldots,Y_m \leq y_m \mid Z_1=z_1,\ldots,Z_n=z_n) \\[6pt] F_{\mathbf{X}\,\mid\,\mathbf{Z}\,=\,\mathbf{z}}(\mathbf{x}) &= \Pr(X_1 \leq x_1,\ldots,X_l \leq x_l \mid Z_1=z_1,\ldots,Z_n=z_n) \\[6pt] F_{\mathbf{Y}\,\mid\,\mathbf{Z}\,=\,\mathbf{z}}(\mathbf{y}) &= \Pr(Y_1 \leq y_1,\ldots,Y_m \leq y_m \mid Z_1=z_1,\ldots,Z_n=z_n) \end{align}</math>

==Uses in Bayesian inference==
Let ''p'' be the proportion of voters who will vote "yes" in an upcoming [[referendum]]. In taking an [[opinion poll]], one chooses ''n'' voters randomly from the population. For ''i'' = 1, ..., ''n'', let ''X''<sub>''i''</sub> = 1 or 0 corresponding, respectively, to whether or not the ''i''th chosen voter will vote "yes".

In a [[frequency probability|frequentist]] approach to [[statistical inference]] one would not attribute any probability distribution to ''p'' (unless the probabilities could be somehow interpreted as relative frequencies of occurrence of some event or as proportions of some population) and one would say that ''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub> are [[statistical independence|independent]] random variables.

By contrast, in a [[Bayesian inference|Bayesian]] approach to statistical inference, one would assign a [[probability distribution]] to ''p'' regardless of the non-existence of any such "frequency" interpretation, and one would construe the probabilities as degrees of belief that ''p'' is in any interval to which a probability is assigned. In that model, the random variables ''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub> are ''not'' independent, but they are '''conditionally independent''' given the value of ''p''. In particular, if a large number of the ''X''s are observed to be equal to 1, that would imply a high [[conditional probability]], given that observation, that ''p'' is near 1, and thus a high conditional probability, given that observation, that the ''next'' ''X'' to be observed will be equal to 1.
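A simple simulation illustrates this behaviour. In the Python sketch below, the uniform prior on ''p'', the sample size, and the band used to approximate "knowing ''p''" are illustrative choices only: marginally, one observation being 1 raises the probability that the next one equals 1, but once the value of ''p'' is (approximately) fixed, the observations behave independently.

<syntaxhighlight lang="python">
import random

random.seed(0)
N = 200_000

# Bayesian model: p is drawn from a uniform prior, then X1, X2 are i.i.d. Bernoulli(p) given p.
samples = []
for _ in range(N):
    p = random.random()
    x1 = 1 if random.random() < p else 0
    x2 = 1 if random.random() < p else 0
    samples.append((p, x1, x2))

# Marginally X1 and X2 are dependent: observing X1 = 1 makes a large p, hence X2 = 1, more likely.
p_x2 = sum(x2 for _, _, x2 in samples) / N
p_x2_given_x1 = sum(x2 for _, x1, x2 in samples if x1 == 1) / sum(x1 for _, x1, _ in samples)
print(round(p_x2, 3), round(p_x2_given_x1, 3))             # about 0.5 versus about 0.667

# Conditionally on (approximately) knowing p, the dependence disappears.
band = [(x1, x2) for p, x1, x2 in samples if 0.65 < p < 0.75]
p_x2_band = sum(x2 for _, x2 in band) / len(band)
p_x2_band_given_x1 = sum(x2 for x1, x2 in band if x1 == 1) / sum(x1 for x1, _ in band)
print(round(p_x2_band, 3), round(p_x2_band_given_x1, 3))   # both close to 0.7
</syntaxhighlight>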
==Rules of conditional independence==
A set of rules governing statements of conditional independence has been derived from the basic definition.<ref>{{cite journal |first=A. P. |last=Dawid |authorlink=Philip Dawid |title=Conditional Independence in Statistical Theory |journal=[[Journal of the Royal Statistical Society, Series B]] |year=1979 |volume=41 |issue=1 |pages=1–31 |mr=0535541 |jstor=2984718 }}</ref><ref name=pearl:2000>J Pearl, Causality: Models, Reasoning, and Inference, 2000, Cambridge University Press</ref>

These rules were termed "[[Graphoid]] Axioms" by Pearl and Paz,<ref name=pearl:paz85>{{cite conference | last1 = Pearl | first1 = Judea | author1-link = Judea Pearl | last2 = Paz | first2 = Azaria | editor1-last = du Boulay | editor1-first = Benedict | editor2-last = Hogg | editor2-first = David C. | editor3-last = Steels | editor3-first = Luc | contribution = Graphoids: Graph-Based Logic for Reasoning about Relevance Relations or When would x tell you more about y if you already know z? | pages = 357–363 | publisher = North-Holland | title = Advances in Artificial Intelligence II, Seventh European Conference on Artificial Intelligence, ECAI 1986, Brighton, UK, July 20–25, 1986, Proceedings | url = https://ftp.cs.ucla.edu/pub/stat_ser/r53-L.pdf | year = 1986}}</ref> because they hold in graphs, where <math>X \perp\!\!\!\perp A\mid B</math> is interpreted to mean: "All paths from ''X'' to ''A'' are intercepted by the set ''B''".<ref name=pearl:88>{{cite book|last1=Pearl|first1=Judea|title=Probabilistic reasoning in intelligent systems: networks of plausible inference|url=https://archive.org/details/probabilisticrea00pear|url-access=registration|date=1988|publisher=Morgan Kaufmann|isbn=9780934613736}}</ref>

===Symmetry===
: <math>X \perp\!\!\!\perp Y \mid Z \quad \Leftrightarrow \quad Y \perp\!\!\!\perp X \mid Z</math>

'''Proof:''' From the definition of conditional independence,

: <math>X \perp\!\!\!\perp Y \mid Z \quad \Leftrightarrow \quad P(X, Y \mid Z) = P(X \mid Z) P(Y \mid Z) \quad \Leftrightarrow \quad Y \perp\!\!\!\perp X \mid Z</math>

===Decomposition===
: <math>X \perp\!\!\!\perp Y \mid Z \quad \Rightarrow \quad h(X) \perp\!\!\!\perp Y \mid Z</math>

'''Proof:''' From the definition of conditional independence, we seek to show that

: <math>X \perp\!\!\!\perp Y \mid Z \quad \Rightarrow \quad P(h(X), Y \mid Z) = P(h(X) \mid Z) P(Y \mid Z).</math>

The left side of this equality is

: <math>P(h(X)=a, Y=y \mid Z=z) = \sum_{x \colon h(x)=a} P(X=x, Y=y \mid Z=z),</math>

where the right side is the sum, over all values <math>x</math> with <math>h(x)=a</math>, of the joint conditional probability of <math>X</math> and <math>Y</math> given <math>Z</math>. Using the assumed conditional independence,

: <math>\begin{align} \sum_{x \colon h(x)=a} P(X=x, Y=y \mid Z=z) &= \sum_{x \colon h(x)=a} P(X=x \mid Z=z) P(Y=y \mid Z=z) \\ &= P(Y=y \mid Z=z) \sum_{x \colon h(x)=a} P(X=x \mid Z=z) \\ &= P(Y=y \mid Z=z) P(h(X)=a \mid Z=z). \end{align}</math>

Special cases of this property include:
* <math> (X, W) \perp\!\!\!\perp Y \mid Z \quad \Rightarrow \quad X \perp\!\!\!\perp Y \mid Z </math>
** '''Proof:''' Let us define <math> A = (X,W) </math> and let <math> h(\cdot) </math> be the 'extraction' function <math> h(X,W) = X</math>. Then:
: <math> \begin{align} (X,W) \perp\!\!\!\perp Y \mid Z \quad &\Leftrightarrow \quad A \perp\!\!\!\perp Y \mid Z \\ &\Rightarrow \quad h(A) \perp\!\!\!\perp Y \mid Z \quad &\text{Decomposition} \\ &\Leftrightarrow \quad X \perp\!\!\!\perp Y \mid Z \end{align} </math>
* <math> X \perp\!\!\!\perp (Y, W) \mid Z \quad \Rightarrow \quad X \perp\!\!\!\perp Y \mid Z </math>
** '''Proof:''' Let us define <math> V = (Y,W) </math> and let <math> h(\cdot) </math> again be an 'extraction' function, <math> h(Y,W) = Y</math>. Then:
: <math> \begin{align} X \perp\!\!\!\perp (Y,W) \mid Z \quad &\Leftrightarrow \quad X \perp\!\!\!\perp V \mid Z \\ &\Leftrightarrow \quad V \perp\!\!\!\perp X \mid Z \quad &\text{Symmetry} \\ &\Rightarrow \quad h(V) \perp\!\!\!\perp X \mid Z \quad &\text{Decomposition} \\ &\Leftrightarrow \quad Y \perp\!\!\!\perp X \mid Z \\ &\Leftrightarrow \quad X \perp\!\!\!\perp Y \mid Z \quad &\text{Symmetry} \end{align} </math>

===Weak union===
: <math>X \perp\!\!\!\perp Y \mid Z \quad \Rightarrow \quad X \perp\!\!\!\perp Y \mid (Z, h(X))</math>

'''Proof:''' Given <math> X \perp\!\!\!\perp Y \mid Z </math>, we aim to show

: <math> \begin{align} X \perp\!\!\!\perp Y \mid (Z, h(X)) \quad &\Leftrightarrow \quad X \perp\!\!\!\perp Y \mid U \quad &\text{where } U = (Z, h(X)) \\ &\Leftrightarrow \quad Y \perp\!\!\!\perp X \mid U \quad &\text{Symmetry} \\ &\Leftrightarrow \quad P(Y\mid X, U) = P(Y\mid U) \\ &\Leftrightarrow \quad P(Y \mid X, Z, h(X)) = P(Y \mid Z, h(X)) \end{align} </math>

We begin with the left side of this last equation:

: <math> \begin{align} P(Y \mid X, Z, h(X)) &= P(Y \mid X, Z) \quad &h(X) \text{ is a function of } X \\ &= P(Y \mid Z) \quad &\text{since by symmetry } Y \perp\!\!\!\perp X \mid Z \end{align} </math>

From the given condition:

: <math> \begin{align} X \perp\!\!\!\perp Y \mid Z \quad &\Rightarrow \quad h(X) \perp\!\!\!\perp Y \mid Z \quad &\text{Decomposition} \\ &\Leftrightarrow \quad Y \perp\!\!\!\perp h(X) \mid Z \quad &\text{Symmetry} \\ &\Rightarrow \quad P(Y \mid Z, h(X)) = P(Y \mid Z) \end{align} </math>

Thus <math> P(Y \mid X, Z, h(X)) = P(Y \mid Z) = P(Y \mid Z, h(X)) </math>, so we have shown <math> X \perp\!\!\!\perp Y \mid (Z, h(X)) </math>.

'''Special cases:''' Some textbooks present the property as
* <math> X \perp\!\!\!\perp (Y, W) \mid Z \quad \Rightarrow \quad X \perp\!\!\!\perp Y \mid (Z, W) </math><ref name="Koller">{{cite book |last1=Koller |first1=Daphne |last2=Friedman |first2=Nir |title=Probabilistic Graphical Models |date=2009 |publisher=The MIT Press |location=Cambridge, MA |isbn=9780262013192}}</ref>
* <math> (X,W) \perp\!\!\!\perp Y \mid Z \quad \Rightarrow \quad X \perp\!\!\!\perp Y \mid (Z,W) </math>
Both versions can be shown to follow from the form of weak union given above, using the same method as in the decomposition section.

===Contraction===
: <math> \left.\begin{align} X \perp\!\!\!\perp A \mid B \\ X \perp\!\!\!\perp B \end{align}\right\} \quad \Rightarrow \quad X \perp\!\!\!\perp A, B </math>

'''Proof:''' This property can be proved by noticing <math>\Pr(X\mid A,B) = \Pr(X\mid B) = \Pr(X)</math>, where each equality is asserted by <math>X \perp\!\!\!\perp A \mid B</math> and <math>X \perp\!\!\!\perp B</math>, respectively.
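These rules can be spot-checked numerically. The Python sketch below is an ad hoc illustration: the helper <code>cond_indep</code> and the chosen distribution are not taken from the cited sources. It builds a joint distribution in which <math>X \perp\!\!\!\perp Y \mid Z</math> holds by construction and confirms the decomposition and weak-union consequences for the coarsening <math>h(X)=\min(X,1)</math>.

<syntaxhighlight lang="python">
def cond_indep(joint, i, j, cond, tol=1e-9):
    """Check P(V_i, V_j | V_cond) = P(V_i | V_cond) * P(V_j | V_cond) for every assignment.
    `joint` maps tuples of variable values to probabilities; i, j and cond are positions."""
    key = lambda k: tuple(k[c] for c in cond)
    for kc in {key(k) for k in joint}:
        pc = sum(p for k, p in joint.items() if key(k) == kc)
        if pc == 0:
            continue
        for vi in {k[i] for k in joint}:
            for vj in {k[j] for k in joint}:
                pij = sum(p for k, p in joint.items() if key(k) == kc and k[i] == vi and k[j] == vj)
                pi = sum(p for k, p in joint.items() if key(k) == kc and k[i] == vi)
                pj = sum(p for k, p in joint.items() if key(k) == kc and k[j] == vj)
                if abs(pij / pc - (pi / pc) * (pj / pc)) > tol:
                    return False
    return True

# Joint distribution of (X, Y, Z, H) with X independent of Y given Z built in,
# where H = h(X) = min(X, 1) is a coarsening of X. Variable order: 0=X, 1=Y, 2=Z, 3=H.
pz = [0.4, 0.6]
px_given_z = [[0.2, 0.3, 0.5], [0.6, 0.1, 0.3]]
py_given_z = [[0.7, 0.3], [0.25, 0.75]]
h = lambda x: min(x, 1)

joint = {}
for z in range(2):
    for x in range(3):
        for y in range(2):
            joint[(x, y, z, h(x))] = pz[z] * px_given_z[z][x] * py_given_z[z][y]

print(cond_indep(joint, 0, 1, [2]))     # X indep. of Y given Z        (by construction) -> True
print(cond_indep(joint, 3, 1, [2]))     # h(X) indep. of Y given Z     (decomposition)   -> True
print(cond_indep(joint, 0, 1, [2, 3]))  # X indep. of Y given Z, h(X)  (weak union)      -> True
</syntaxhighlight>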
===Intersection===
For strictly positive probability distributions,<ref name=pearl:2000 /> the following also holds:

: <math> \left.\begin{align} X \perp\!\!\!\perp Y \mid Z, W \\ X \perp\!\!\!\perp W \mid Z, Y \end{align}\right\} \quad \Rightarrow \quad X \perp\!\!\!\perp W, Y \mid Z </math>

'''Proof:''' By assumption:

: <math>P(X \mid Z, W, Y) = P(X \mid Z, W) \land P(X \mid Z, W, Y) = P(X \mid Z, Y) \implies P(X \mid Z, Y) = P(X \mid Z, W)</math>

Using this equality, together with the [[law of total probability]] applied to <math>P(X \mid Z)</math>:

: <math>\begin{align} P(X \mid Z) &= \sum_{w} P(X \mid Z, W=w)P(W=w \mid Z) \\[4pt] &= \sum_{w} P(X \mid Z, Y)P(W=w \mid Z) \\[4pt] &= P(X \mid Z, Y) \sum_{w} P(W=w \mid Z) \\[4pt] &= P(X \mid Z, Y) \end{align}</math>

Since <math>P(X \mid Z, W, Y) = P(X \mid Z, Y)</math> and <math>P(X \mid Z, Y) = P(X \mid Z)</math>, it follows that <math>P(X \mid Z, W, Y) = P(X \mid Z)</math>, which is equivalent to <math>X \perp\!\!\!\perp W, Y \mid Z</math>.

Technical note: since these implications hold for any probability space, they will still hold if one considers a sub-universe by conditioning everything on another variable, say ''K''. For example, <math>X \perp\!\!\!\perp Y \Rightarrow Y \perp\!\!\!\perp X</math> would also mean that <math>X \perp\!\!\!\perp Y \mid K \Rightarrow Y \perp\!\!\!\perp X \mid K</math>.

==See also==
*[[Graphoid]]
*[[Conditional dependence]]
*[[de Finetti's theorem]]
*[[Conditional expectation]]

==References==
{{Reflist}}

==External links==
*{{Commons category-inline}}

{{DEFAULTSORT:Conditional Independence}}
[[Category:Independence (probability theory)]]