{{Use American English|date = April 2019}}
{{Short description|Measure of covariance of components of a random vector}}
{{Confuse|Cross-covariance matrix}}
[[Image:Gaussian-2d.png|thumb|right|A [[bivariate Gaussian distribution|bivariate Gaussian probability density function]] centered at (0, 0), with covariance matrix given by <math>\begin{bmatrix} 1 & 0.5\\ 0.5 & 1 \end{bmatrix}</math>]]
[[Image:GaussianScatterPCA.svg|thumb|right|Sample points from a [[bivariate Gaussian distribution]] with a standard deviation of 3 in roughly the lower left–upper right direction and of 1 in the orthogonal direction. Because the ''x'' and ''y'' components co-vary, the variances of <math>x</math> and <math>y</math> do not fully describe the distribution. A <math>2 \times 2</math> covariance matrix is needed; the directions of the arrows correspond to the [[eigenvector]]s of this covariance matrix and their lengths to the square roots of the [[eigenvalues]].]]
{{Correlation and covariance}}

In [[probability theory]] and [[statistics]], a '''covariance matrix''' (also known as '''auto-covariance matrix''', '''dispersion matrix''', '''variance matrix''', or '''variance–covariance matrix''') is a square [[Matrix (mathematics)|matrix]] giving the [[covariance]] between each pair of elements of a given [[random vector]].

Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions. As an example, the variation in a collection of random points in two-dimensional space cannot be characterized fully by a single number, nor would the variances in the <math>x</math> and <math>y</math> directions contain all of the necessary information; a <math>2 \times 2</math> matrix would be necessary to fully characterize the two-dimensional variation.

Any [[covariance]] matrix is [[symmetric matrix|symmetric]] and [[positive semi-definite matrix|positive semi-definite]] and its main diagonal contains [[variance]]s (i.e., the covariance of each element with itself). The covariance matrix of a random vector <math>\mathbf{X}</math> is typically denoted by <math>\operatorname{K}_{\mathbf{X}\mathbf{X}}</math>, <math>\Sigma</math> or <math>S</math>.
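For example, for a two-dimensional random vector <math>\mathbf{X} = (X_1, X_2)^\mathsf{T}</math>, the covariance matrix collects the two variances and the single covariance,
<math display="block">\operatorname{K}_{\mathbf{X}\mathbf{X}} = \begin{bmatrix} \operatorname{var}(X_1) & \operatorname{cov}(X_1, X_2) \\ \operatorname{cov}(X_2, X_1) & \operatorname{var}(X_2) \end{bmatrix},</math>
which is symmetric because <math>\operatorname{cov}(X_1, X_2) = \operatorname{cov}(X_2, X_1)</math>; the matrix in the figure above, <math>\begin{bmatrix} 1 & 0.5\\ 0.5 & 1 \end{bmatrix}</math>, is an instance with unit variances and covariance 0.5.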