===Uncorrelatedness and independence===
{{main|Correlation and dependence}}
Random variables whose covariance is zero are called [[uncorrelated]].<ref name=KunIlPark/>{{rp|p= 121}} Similarly, the components of random vectors whose [[covariance matrix]] is zero in every entry outside the main diagonal are also called uncorrelated. If <math>X</math> and <math>Y</math> are [[statistical independence|independent random variables]], then their covariance is zero.<ref name=KunIlPark/>{{rp|p= 123}}<ref>{{Cite web | url=http://www.randomservices.org/random/expect/Covariance.html| title=Covariance and Correlation | last=Siegrist|first=Kyle| publisher=University of Alabama in Huntsville |access-date=Oct 3, 2022}}</ref> This follows because under independence, <math display="block">\operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]. </math>

The converse, however, is not generally true. For example, let <math>X</math> be uniformly distributed in <math>[-1,1]</math> and let <math>Y = X^2</math>. Clearly, <math>X</math> and <math>Y</math> are not independent, but <math display="block">\begin{align} \operatorname{cov}(X, Y) &= \operatorname{cov}\left(X, X^2\right) \\ &= \operatorname{E}\left[X \cdot X^2\right] - \operatorname{E}[X] \cdot \operatorname{E}\left[X^2\right] \\ &= \operatorname{E}\left[X^3\right] - \operatorname{E}[X]\operatorname{E}\left[X^2\right] \\ &= 0 - 0 \cdot \operatorname{E}[X^2] \\ &= 0. \end{align}</math> In this case, the relationship between <math>Y</math> and <math>X</math> is non-linear, while correlation and covariance are measures of linear dependence between two random variables. This example shows that if two random variables are uncorrelated, that does not in general imply that they are independent.
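The uncorrelated-but-dependent example above can also be checked numerically. The following sketch (a Monte Carlo estimate, not part of the article itself) samples <math>X</math> uniformly on <math>[-1,1]</math>, sets <math>Y = X^2</math>, and estimates <math>\operatorname{cov}(X,Y)</math> as <math>\operatorname{E}[XY]-\operatorname{E}[X]\operatorname{E}[Y]</math>; the estimate comes out near zero even though <math>Y</math> is a deterministic function of <math>X</math>:

```python
import random

random.seed(0)
n = 100_000

# X uniform on [-1, 1]; Y = X^2 is completely determined by X
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
ys = [x * x for x in xs]

mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Sample analogue of cov(X, Y) = E[XY] - E[X] E[Y]
cov_xy = sum(x * y for x, y in zip(xs, ys)) / n - mean_x * mean_y

# cov_xy is approximately 0: X and Y are (nearly) uncorrelated
# despite being perfectly dependent.
```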
However, if two variables are [[Multivariate normal distribution|jointly normally distributed]] (but not if they are merely [[Normally distributed and uncorrelated does not imply independent|individually normally distributed]]), uncorrelatedness ''does'' imply independence.<ref>{{Cite book |title=A modern introduction to probability and statistics: understanding why and how |date=2005 |publisher=Springer |isbn=978-1-85233-896-1 |editor-last=Dekking |editor-first=Michel |series=Springer texts in statistics |location=London [Heidelberg]}}</ref> Random variables <math>X</math> and <math>Y</math> whose covariance is positive are called positively correlated: if <math>X > \operatorname{E}[X]</math>, then <math>Y > \operatorname{E}[Y]</math> is likely to hold as well. Conversely, <math>X</math> and <math>Y</math> with negative covariance are called negatively correlated: if <math>X > \operatorname{E}[X]</math>, then <math>Y < \operatorname{E}[Y]</math> is likely.
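The sign interpretation of covariance can be illustrated with a simple simulated pair. In this sketch (an illustrative construction, with <math>Y = X + \text{noise}</math> chosen only for the demonstration), the covariance is positive, and samples with <math>X</math> above its mean indeed tend to have <math>Y</math> above its mean:

```python
import random

random.seed(1)
n = 100_000

# X standard normal; Y = X + independent noise,
# so cov(X, Y) = Var(X) = 1 > 0 (positively correlated).
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x + random.gauss(0.0, 1.0) for x in xs]

mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Sample covariance (centered form)
cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n

# Fraction of samples where X and Y are both above their means;
# with positive covariance this exceeds the 0.25 expected under independence.
above_both = sum(1 for x, y in zip(xs, ys) if x > mean_x and y > mean_y) / n
```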