==Conditional independence of random variables==

Two discrete [[random variable]]s <math>X</math> and <math>Y</math> are conditionally independent given a third discrete random variable <math>Z</math> if and only if they are [[Independence (probability theory)|independent]] in their [[conditional probability distribution]] given <math>Z</math>. That is, <math>X</math> and <math>Y</math> are conditionally independent given <math>Z</math> if and only if, given any value of <math>Z</math>, the probability distribution of <math>X</math> is the same for all values of <math>Y</math> and the probability distribution of <math>Y</math> is the same for all values of <math>X</math>. Formally:

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math>(X \perp\!\!\!\perp Y) \mid Z \quad \iff \quad F_{X,Y\,\mid\,Z\,=\,z}(x,y) = F_{X\,\mid\,Z\,=\,z}(x) \cdot F_{Y\,\mid\,Z\,=\,z}(y) \quad \text{for all } x,y,z</math>|{{EquationRef|Eq.2}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

where <math>F_{X,Y\,\mid\,Z\,=\,z}(x,y)=\Pr(X \leq x, Y \leq y \mid Z=z)</math> is the conditional [[cumulative distribution function]] of <math>X</math> and <math>Y</math> given <math>Z</math>.

Two events <math>R</math> and <math>B</math> are conditionally independent given a [[sigma-algebra|σ-algebra]] <math>\Sigma</math> if

:<math>\Pr(R, B \mid \Sigma) = \Pr(R \mid \Sigma)\Pr(B \mid \Sigma) \text{ a.s.}</math>

where <math>\Pr(A \mid \Sigma)</math> denotes the [[conditional expectation]] of the [[indicator function]] of the event <math>A</math>, <math>\chi_A</math>, given the σ-algebra <math>\Sigma</math>. That is,

:<math>\Pr(A \mid \Sigma) := \operatorname{E}[\chi_A \mid \Sigma].</math>

Two random variables <math>X</math> and <math>Y</math> are conditionally independent given a σ-algebra <math>\Sigma</math> if the above equation holds for all <math>R</math> in <math>\sigma(X)</math> and <math>B</math> in <math>\sigma(Y)</math>.

Two random variables <math>X</math> and <math>Y</math> are conditionally independent given a random variable <math>W</math> if they are independent given ''σ''(''W''), the σ-algebra generated by <math>W</math>. This is commonly written

:<math>X \perp\!\!\!\perp Y \mid W</math>

or

:<math>X \perp Y \mid W.</math>

This is read "<math>X</math> is independent of <math>Y</math>, '''given''' <math>W</math>"; the conditioning applies to the whole statement: "(<math>X</math> is independent of <math>Y</math>) given <math>W</math>":

:<math>(X \perp\!\!\!\perp Y) \mid W.</math>

This notation extends <math>X \perp\!\!\!\perp Y</math> for "<math>X</math> is [[Independence (probability theory)|independent]] of <math>Y</math>." If <math>W</math> assumes a countable set of values, this is equivalent to the conditional independence of ''X'' and ''Y'' for the events of the form <math>[W=w]</math>. Conditional independence of more than two events, or of more than two random variables, is defined analogously.
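For random variables with finite support, the defining factorization can be checked directly from the joint probability mass function. The following Python sketch illustrates this; it is not part of the formal development, and the function name and the dictionary encoding of the joint pmf are ad hoc choices made here, not a standard interface.

<syntaxhighlight lang="python">
from itertools import product

def conditionally_independent(joint, tol=1e-12):
    """Check X ⊥⊥ Y | Z for a finite joint pmf.

    `joint` maps (x, y, z) triples to probabilities Pr(X=x, Y=y, Z=z).
    Returns True iff Pr(X=x, Y=y | Z=z) = Pr(X=x | Z=z) * Pr(Y=y | Z=z)
    for every z with Pr(Z=z) > 0.
    """
    xs = {x for x, _, _ in joint}
    ys = {y for _, y, _ in joint}
    zs = {z for _, _, z in joint}
    for z in zs:
        p_z = sum(joint.get((x, y, z), 0.0) for x, y in product(xs, ys))
        if p_z == 0.0:
            continue  # conditioning on a null event is vacuous
        for x, y in product(xs, ys):
            p_xy = joint.get((x, y, z), 0.0) / p_z
            p_x = sum(joint.get((x, v, z), 0.0) for v in ys) / p_z
            p_y = sum(joint.get((u, y, z), 0.0) for u in xs) / p_z
            if abs(p_xy - p_x * p_y) > tol:
                return False
    return True
</syntaxhighlight>

If the pmf is specified with exact rationals (for example <code>fractions.Fraction</code>), the tolerance can be set to zero and the check becomes exact.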
The following two examples show that <math>X \perp\!\!\!\perp Y</math> ''neither implies nor is implied by'' <math>(X \perp\!\!\!\perp Y) \mid W</math>.

First, suppose <math>W</math> is 0 with probability 0.5 and 1 otherwise. When ''W'' = 0, take <math>X</math> and <math>Y</math> to be independent, each having the value 0 with probability 0.99 and the value 1 otherwise. When <math>W = 1</math>, <math>X</math> and <math>Y</math> are again independent, but this time they take the value 1 with probability 0.99. Then <math>(X \perp\!\!\!\perp Y) \mid W</math>. But <math>X</math> and <math>Y</math> are dependent, because Pr(''X'' = 0) < Pr(''X'' = 0 | ''Y'' = 0). This is because Pr(''X'' = 0) = 0.5, but if ''Y'' = 0 then it is very likely that ''W'' = 0 and thus that ''X'' = 0 as well, so Pr(''X'' = 0 | ''Y'' = 0) > 0.5.

For the second example, suppose <math>X \perp\!\!\!\perp Y</math>, each taking the values 0 and 1 with probability 0.5. Let <math>W</math> be the product <math>X \cdot Y</math>. Then, conditional on <math>W = 0</math>, Pr(''X'' = 0) = 2/3, but Pr(''X'' = 0 | ''Y'' = 0) = 1/2, so <math>(X \perp\!\!\!\perp Y) \mid W</math> is false. This is also an example of explaining away. See Kevin Murphy's tutorial<ref>{{Cite web|url=http://people.cs.ubc.ca/~murphyk/Bayes/bnintro.html|title=Graphical Models}}</ref> where <math>X</math> and <math>Y</math> take the values "brainy" and "sporty".
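Both counterexamples can be verified by exact arithmetic. The following Python sketch (illustrative only; the variable names and the dictionary encodings of the joint pmfs are choices made here) computes the probabilities quoted above:

<syntaxhighlight lang="python">
from fractions import Fraction

# First example: (X ⊥⊥ Y) | W holds, but X and Y are marginally dependent.
# Joint pmf Pr(W = w, X = x, Y = y): given W, the variables X and Y are
# i.i.d. and each matches W with probability 0.99.
p = Fraction(99, 100)
joint1 = {(w, x, y): Fraction(1, 2)
          * (p if x == w else 1 - p)
          * (p if y == w else 1 - p)
          for w in (0, 1) for x in (0, 1) for y in (0, 1)}

pr_x0 = sum(v for (w, x, y), v in joint1.items() if x == 0)
pr_y0 = sum(v for (w, x, y), v in joint1.items() if y == 0)
pr_x0_given_y0 = sum(v for (w, x, y), v in joint1.items()
                     if x == 0 and y == 0) / pr_y0
print(pr_x0, pr_x0_given_y0)       # 1/2 versus 4901/5000: dependent

# Second example: X ⊥⊥ Y holds, but (X ⊥⊥ Y) | W fails for W = X * Y.
joint2 = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}
pr_w0 = sum(v for (x, y), v in joint2.items() if x * y == 0)
pr_x0_given_w0 = sum(v for (x, y), v in joint2.items()
                     if x == 0) / pr_w0      # x == 0 implies x * y == 0
pr_y0_w0 = sum(v for (x, y), v in joint2.items() if y == 0)
pr_x0_given_y0_w0 = sum(v for (x, y), v in joint2.items()
                        if x == 0 and y == 0) / pr_y0_w0
print(pr_x0_given_w0, pr_x0_given_y0_w0)     # 2/3 versus 1/2: not equal
</syntaxhighlight>

These joint pmfs can also be fed to the checker sketched earlier, after permuting each key so that the conditioning variable <math>W</math> sits in the last position.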