==Properties==

===Self-independence===
Note that an event is independent of itself if and only if
:<math>\mathrm{P}(A) = \mathrm{P}(A \cap A) = \mathrm{P}(A) \cdot \mathrm{P}(A) \iff \mathrm{P}(A) = 0 \text{ or } \mathrm{P}(A) = 1.</math>
Thus an event is independent of itself if and only if it [[almost surely]] occurs or its [[Complement (set theory)|complement]] almost surely occurs; this fact is useful when proving [[zero–one law]]s.<ref>{{cite book|last=Durrett|first=Richard|author-link=Rick Durrett|title=Probability: theory and examples|edition=Second|year=1996|page=62}}</ref>

===Expectation and covariance===
{{main|Correlation and dependence}}
If <math>X</math> and <math>Y</math> are statistically independent random variables, then the [[expected value|expectation operator]] <math>\operatorname{E}</math> has the property
:<math>\operatorname{E}[X^n Y^m] = \operatorname{E}[X^n] \operatorname{E}[Y^m],</math><ref name=JakemanBook>{{cite book|author=E. Jakeman|title=Modeling Fluctuations in Scattered Waves|isbn=978-0-7503-1005-5}}</ref>{{rp|p. 10}}
and the [[covariance]] <math>\operatorname{cov}[X,Y]</math> is zero, as follows from
:<math>\operatorname{cov}[X,Y] = \operatorname{E}[X Y] - \operatorname{E}[X] \operatorname{E}[Y].</math>
The converse does not hold: two random variables can have zero covariance and still fail to be independent; see the numerical sketch after this section.
{{See also|Uncorrelatedness (probability theory)}}

Similarly, if two stochastic processes <math>\left\{ X_t \right\}_{t\in\mathcal{T}}</math> and <math>\left\{ Y_t \right\}_{t\in\mathcal{T}}</math> are independent, then they are [[Uncorrelatedness (probability theory)|uncorrelated]].<ref name=KunIlPark>{{cite book|author=Park, Kun Il|title=Fundamentals of Probability and Stochastic Processes with Applications to Communications|publisher=Springer|year=2018|isbn=978-3-319-68074-3}}</ref>{{rp|p. 151}}

===Characteristic function===
Two random variables <math>X</math> and <math>Y</math> are independent if and only if the [[characteristic function (probability theory)|characteristic function]] of the random vector <math>(X,Y)</math> satisfies
:<math>\varphi_{(X,Y)}(t,s) = \varphi_{X}(t)\cdot \varphi_{Y}(s).</math>
In particular, the characteristic function of their sum is the product of their marginal characteristic functions:
:<math>\varphi_{X+Y}(t) = \varphi_X(t)\cdot\varphi_Y(t),</math>
though the reverse implication does not hold. Random variables that satisfy the latter condition are called [[subindependence|subindependent]].
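The failure of the converse in the covariance section can be seen with the standard counterexample <math>X \sim N(0,1)</math>, <math>Y = X^2</math>: here <math>\operatorname{cov}[X,Y] = \operatorname{E}[X^3] = 0</math>, yet the moment factorization <math>\operatorname{E}[X^n Y^m] = \operatorname{E}[X^n]\operatorname{E}[Y^m]</math> fails for <math>n = 2, m = 1</math>. A minimal numerical sketch of this, assuming NumPy (the sample size and seed are arbitrary, and the variable names are illustrative only):

<syntaxhighlight lang="python">
# Zero covariance does not imply independence:
# X ~ N(0,1) and Y = X^2 are uncorrelated (E[X^3] = 0) but dependent.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x ** 2

# Covariance is approximately zero.
print("cov(X, Y)  ~", np.mean(x * y) - np.mean(x) * np.mean(y))

# Independence would force E[X^2 Y] = E[X^2] E[Y]; here it fails,
# since E[X^2 Y] = E[X^4] = 3 while E[X^2] E[Y] = 1 * 1 = 1.
print("E[X^2 Y]   ~", np.mean(x**2 * y))
print("E[X^2]E[Y] ~", np.mean(x**2) * np.mean(y))
</syntaxhighlight>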
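Under the same assumptions, the characteristic-function factorization can be checked empirically by replacing each exact characteristic function with the empirical version <math>\hat\varphi_Z(t) = \tfrac{1}{n}\sum_{k=1}^n e^{\mathrm{i} t Z_k}</math>; for two independently drawn standard normal samples, the joint value at <math>(t,s)</math> should match the product of the marginals up to sampling error:

<syntaxhighlight lang="python">
# Empirical check of the characteristic-function product rule for
# independent X and Y (both standard normal here).
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)  # drawn independently of x

def ecf(z, t):
    """Empirical characteristic function: mean of exp(i t Z)."""
    return np.mean(np.exp(1j * t * z))

t, s = 0.7, -1.3
# phi_{(X,Y)}(t, s) = E[exp(i (t X + s Y))] vs phi_X(t) * phi_Y(s).
print("joint   :", np.mean(np.exp(1j * (t * x + s * y))))
print("product :", ecf(x, t) * ecf(y, s))

# Sum of independent variables: phi_{X+Y}(t) = phi_X(t) * phi_Y(t).
print("phi_{X+Y}(t):", ecf(x + y, t))
print("phi_X phi_Y :", ecf(x, t) * ecf(y, t))
</syntaxhighlight>

Note that such a check can only support the product property, not certify it: agreement at finitely many points <math>(t,s)</math> is necessary but not sufficient for independence.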