Central moment
In probability theory and statistics, a central moment is a moment of a probability distribution of a random variable about the random variable's mean; that is, it is the expected value of a specified integer power of the deviation of the random variable from the mean. The various moments form one set of values by which the properties of a probability distribution can be usefully characterized. Central moments are used in preference to ordinary moments, computed in terms of deviations from the mean instead of from zero, because the higher-order central moments relate only to the spread and shape of the distribution, rather than also to its location.
Sets of central moments can be defined for both univariate and multivariate distributions.
Univariate moments
The n-th moment about the mean (or n-th central moment) of a real-valued random variable X is the quantity <math>\mu_n := \operatorname{E}\left[{\left(X - \operatorname{E}[X]\right)}^n\right]</math>, where E is the expectation operator. For a continuous univariate probability distribution with probability density function f(x), the n-th moment about the mean μ is <math display="block"> \mu_n = \operatorname{E} \left[ {\left( X - \operatorname{E}[X] \right)}^n \right] = \int_{-\infty}^{+\infty} (x - \mu)^n f(x)\,\mathrm{d} x. </math>
For random variables that have no mean, such as those with a Cauchy distribution, central moments are not defined.
The first few central moments have intuitive interpretations:
- The "zeroth" central moment Template:Math is 1.
- The first central moment Template:Math is 0 (not to be confused with the first raw moment or the expected value Template:Mvar).
- The second central moment Template:Math is called the variance, and is usually denoted Template:Math, where Template:Mvar represents the standard deviation.
- The third and fourth central moments are used to define the standardized moments which are used to define skewness and kurtosis, respectively.
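As a concrete illustration, the following sketch (assuming NumPy is available; the gamma distribution and sample size are arbitrary choices) estimates the first few central moments from a sample and recovers the interpretations above:
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=1.5, size=1_000_000)  # any distribution with a finite mean works

def central_moment(sample, n):
    """Sample estimate of the n-th central moment E[(X - E[X])^n]."""
    return np.mean((sample - sample.mean()) ** n)

print(central_moment(x, 0))            # exactly 1 (zeroth central moment)
print(central_moment(x, 1))            # approximately 0 (first central moment)
print(central_moment(x, 2), x.var())   # second central moment equals the variance
</syntaxhighlight>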
Properties
For all n, the n-th central moment is homogeneous of degree n:
<math display="block">\mu_n(cX) = c^n \mu_n(X).\,</math>
Only for n ∈ {1, 2, 3} do we have an additivity property for random variables X and Y that are independent:
<math display="block">\mu_n(X+Y) = \mu_n(X)+\mu_n(Y)\,</math> provided n ∈ {1, 2, 3}.
A related functional that shares the translation-invariance and homogeneity properties with the n-th central moment, but continues to have this additivity property even when n ≥ 4, is the n-th cumulant <math>\kappa_n(X)</math>. For n = 1, the n-th cumulant is just the expected value; for n = 2 or 3, the n-th cumulant is just the n-th central moment; for n ≥ 4, the n-th cumulant is an n-th-degree monic polynomial in the first n moments (about zero), and is also a (simpler) n-th-degree polynomial in the first n central moments.
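As a numerical illustration (a minimal sketch assuming NumPy; the exponential and normal inputs are arbitrary choices), the additivity for n = 2 and n = 3 can be checked by simulation, together with its failure at n = 4 and the additivity of the fourth cumulant, written in terms of central moments as <math>\kappa_4 = \mu_4 - 3\mu_2^2</math>:
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=1_000_000)       # independent samples from two
y = rng.normal(loc=1.0, scale=3.0, size=1_000_000)   # arbitrarily chosen distributions

def mu(sample, n):
    """Sample estimate of the n-th central moment."""
    return np.mean((sample - sample.mean()) ** n)

def k4(sample):
    """Fourth cumulant expressed through central moments: k4 = mu_4 - 3*mu_2^2."""
    return mu(sample, 4) - 3 * mu(sample, 2) ** 2

# Additivity holds (up to sampling error) for n = 2 and n = 3 ...
print(mu(x + y, 2), mu(x, 2) + mu(y, 2))
print(mu(x + y, 3), mu(x, 3) + mu(y, 3))
# ... but fails for n = 4, while the fourth cumulant remains additive.
print(mu(x + y, 4), mu(x, 4) + mu(y, 4))
print(k4(x + y), k4(x) + k4(y))
</syntaxhighlight>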
Relation to moments about the origin
Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the n-th-order moment about the origin to the moment about the mean is
<math display="block"> \mu_n = \operatorname{E}\left[\left(X - \operatorname{E}[X]\right)^n\right] = \sum_{j=0}^n \binom{n}{j} {\left(-1\right)}^{n-j} \mu'_j \mu^{n-j}, </math>
where μ is the mean of the distribution, and the moment about the origin is given by
<math display="block"> \mu'_m = \int_{-\infty}^{+\infty} x^m f(x)\,dx = \operatorname{E}[X^m] = \sum_{j=0}^m \binom{m}{j} \mu_j \mu^{m-j}. </math>
For the cases n = 2, 3, 4, which are of most interest because of the relations to variance, skewness, and kurtosis, respectively, this formula becomes (noting that <math>\mu = \mu'_1</math> and <math>\mu'_0=1</math>):
<math display="block">\mu_2 = \mu'_2 - \mu^2\,</math> which is commonly referred to as <math> \operatorname{Var}(X) = \operatorname{E}[X^2] - \left(\operatorname{E}[X]\right)^2</math>
<math display="block">\begin{align} \mu_3 &= \mu'_3 - 3 \mu \mu'_2 +2 \mu^3 \\ \mu_4 &= \mu'_4 - 4 \mu \mu'_3 + 6 \mu^2 \mu'_2 - 3 \mu^4. \end{align}</math>
... and so on, following Pascal's triangle, i.e.
<math display="block">\mu_5 = \mu'_5 - 5 \mu \mu'_4 + 10 \mu^2 \mu'_3 - 10 \mu^3 \mu'_2 + 4 \mu^5.\,</math>
because <math>5\mu^4\mu'_1 - \mu^5\mu'_0 = 5\mu^5 - \mu^5 = 4\mu^5</math>.
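The conversion can also be checked symbolically. The sketch below (assuming SymPy; the symbols mp2, mp3, ... are ad hoc names for the raw moments <math>\mu'_k</math>) expands <math>(X - \mu)^n</math>, applies the expectation term by term, and compares the result with the binomial-sum formula above:
<syntaxhighlight lang="python">
import sympy as sp

X, mu = sp.symbols('X mu')
raw = {k: sp.Symbol(f'mp{k}') for k in range(7)}   # mp_k stands for the raw moment mu'_k = E[X^k]
raw[0], raw[1] = sp.Integer(1), mu                  # mu'_0 = 1 and mu'_1 = mu

def central_from_raw(n):
    """Apply E[.] term by term to the expansion of (X - mu)^n, replacing X^k by mu'_k."""
    poly = sp.Poly(sp.expand((X - mu) ** n), X)
    return sp.expand(sum(c * raw[int(k)] for (k,), c in poly.terms()))

def binomial_formula(n):
    """The conversion formula mu_n = sum_j C(n, j) (-1)^(n-j) mu'_j mu^(n-j)."""
    return sp.expand(sum(sp.binomial(n, j) * (-1) ** (n - j) * raw[j] * mu ** (n - j)
                         for j in range(n + 1)))

for n in range(2, 6):
    print(n, sp.simplify(central_from_raw(n) - binomial_formula(n)) == 0)   # True for each n
</syntaxhighlight>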
The following sum is a random variable having a compound distribution
<math display="block">W = \sum_{i=1}^M Y_i, </math>
where the <math>Y_i</math> are mutually independent random variables sharing a common distribution and <math>M</math> is an integer-valued random variable, independent of the <math>Y_k</math>, with its own distribution. The moments of <math>W</math> are obtained as
<math display="block">\operatorname{E}[W^n]= \sum_{i=0}^n\operatorname{E}\left[\binom{M}{i}\right] \sum_{j=0}^i \binom{i}{j} {\left(-1\right)}^{i-j} \operatorname{E} \left[ \left(\sum_{k=1}^j Y_k\right)^n \right], </math>
where <math display="inline">\operatorname{E} \left[ {\left(\sum_{k=1}^j Y_k\right)}^n\right] </math> is defined as zero for <math>j = 0</math>.
Symmetric distributions
In distributions that are symmetric about their means (unaffected by being reflected about the mean), all odd central moments equal zero whenever they exist, because in the formula for the n-th moment, each term involving a value of X less than the mean by a certain amount exactly cancels out the term involving a value of X greater than the mean by the same amount.
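Concretely, for a distribution with a density f symmetric about its mean μ, so that <math>f(\mu - u) = f(\mu + u)</math>, the change of variable <math>u = x - \mu</math> followed by the reflection <math>u \mapsto -u</math> gives
<math display="block">\mu_n = \int_{-\infty}^{+\infty} u^n f(\mu + u)\,\mathrm{d}u = \int_{-\infty}^{+\infty} (-u)^n f(\mu - u)\,\mathrm{d}u = (-1)^n \mu_n,</math>
where the last equality uses the symmetry of f; hence <math>\mu_n = 0</math> whenever n is odd and the moment exists.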
Multivariate moments
For a continuous bivariate probability distribution with probability density function <math>f(x,y)</math>, the <math>(j,k)</math> moment about the mean <math>\mu_{j,k}</math> is <math display="block"> \begin{align} \mu_{j,k} &= \operatorname{E} \left[ {\left( X - \operatorname{E}[X] \right)}^j {\left( Y - \operatorname{E}[Y] \right)}^k \right] \\[2pt] &= \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} {\left(x - \mu_X\right)}^j {\left(y - \mu_Y\right)}^k f(x,y) \, dx \, dy. \end{align} </math>
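In particular, <math>\mu_{1,1}</math> is the covariance of X and Y. A minimal numerical sketch (assuming NumPy; the correlated normal example and its parameters are arbitrary choices):
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)
cov = [[2.0, 0.7], [0.7, 1.0]]                     # arbitrary covariance matrix
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T

mu_11 = np.mean((x - x.mean()) * (y - y.mean()))   # the (1, 1) central moment
print(mu_11, np.cov(x, y, bias=True)[0, 1])        # both approximate Cov(X, Y) = 0.7
</syntaxhighlight>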
Central moment of complex random variables
The k-th central moment for a complex random variable Z is defined as <math display="block">\alpha_k = \operatorname{E}\left[{\left(Z - \operatorname{E}[Z]\right)}^k\right].</math> The absolute k-th central moment of Z is defined as <math display="block">\beta_k = \operatorname{E}\left[{\left|Z - \operatorname{E}[Z]\right|}^k\right].</math>
The second-order central moment <math>\beta_2</math> is called the variance of Z, whereas the second-order central moment <math>\alpha_2</math> is the pseudo-variance of Z.
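A brief numerical sketch (assuming NumPy; the construction of Z with correlated real and imaginary parts is an arbitrary choice) shows the distinction: the variance is real and non-negative, while the pseudo-variance is complex in general:
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(4)
re = rng.normal(size=1_000_000)                         # real part
im = 0.5 * re + rng.normal(scale=0.8, size=1_000_000)   # imaginary part, correlated with the real part
z = re + 1j * im

zc = z - z.mean()
variance        = np.mean(np.abs(zc) ** 2)   # E[|Z - E[Z]|^2], real and non-negative
pseudo_variance = np.mean(zc ** 2)           # E[(Z - E[Z])^2], complex in general
print(variance, pseudo_variance)
</syntaxhighlight>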