==== Operations on a single normal variable ====
If {{tmath|X}} is distributed normally with mean {{tmath|\mu}} and variance <math display=inline>\sigma^2</math>, then
* <math display=inline>aX+b</math>, for any real numbers {{tmath|a}} and {{tmath|b}}, is also normally distributed, with mean <math display=inline>a\mu+b</math> and variance <math display=inline>a^2\sigma^2</math>. That is, the family of normal distributions is closed under [[linear transformations]].
* The exponential of {{tmath|X}} is distributed [[Log-normal distribution|log-normally]]: <math display=inline>e^X \sim \ln(N(\mu, \sigma^2))</math>.
* The standard [[logistic function|sigmoid]] of {{tmath|X}} is [[Logit-normal distribution|logit-normally distributed]]: <math display=inline>\sigma(X) \sim P( \mathcal{N}(\mu,\,\sigma^2) )</math>.
* The absolute value of {{tmath|X}} has a [[folded normal distribution]]: <math display=inline>\left| X \right| \sim N_f(\mu, \sigma^2)</math>. If <math display=inline>\mu = 0</math> this is known as the [[half-normal distribution]].
* The absolute value of the normalized residual, <math display=inline>|X - \mu| / \sigma</math>, has a [[chi distribution]] with one degree of freedom: <math display=inline>|X - \mu| / \sigma \sim \chi_1</math>.
* The square of <math display=inline>X/\sigma</math> has the [[noncentral chi-squared distribution]] with one degree of freedom: <math display=inline>X^2 / \sigma^2 \sim \chi_1^2(\mu^2 / \sigma^2)</math>. If <math display=inline>\mu = 0</math>, the distribution is called simply [[chi-squared distribution|chi-squared]].
* The log-likelihood of a normal variable {{tmath|x}} is simply the log of its [[probability density function]]: <math display=block>\ln p(x)= -\frac{1}{2} \left(\frac{x-\mu}{\sigma} \right)^2 -\ln \left(\sigma \sqrt{2\pi} \right).</math> Since this is a scaled and shifted square of a standard normal variable, it is distributed as a scaled and shifted [[chi-squared distribution|chi-squared]] variable.
* The distribution of the variable {{tmath|X}} restricted to an interval <math display=inline>[a, b]</math> is called the [[truncated normal distribution]].
* <math display=inline>(X - \mu)^{-2}</math> has a [[Lévy distribution]] with location 0 and scale <math display=inline>\sigma^{-2}</math>.
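A minimal Monte Carlo sketch of a few of the properties above, using NumPy and SciPy (the seed, sample size, and parameter values are arbitrary illustrative choices, and the Kolmogorov–Smirnov test is used only as a rough consistency check):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma, a, b = 2.0, 3.0, -1.5, 4.0
x = rng.normal(mu, sigma, size=1_000_000)   # X ~ N(mu, sigma^2)

# aX + b is normal with mean a*mu + b and variance a^2 * sigma^2
y = a * x + b
print(y.mean(), a * mu + b)          # both approximately 1.0
print(y.var(), a**2 * sigma**2)      # both approximately 20.25

# |X - mu| / sigma follows a chi distribution with one degree of freedom
z = np.abs(x - mu) / sigma
print(stats.kstest(z, stats.chi(1).cdf).pvalue)   # typically well above 0.05

# X^2 / sigma^2 follows a noncentral chi-squared distribution with
# one degree of freedom and noncentrality mu^2 / sigma^2
w = (x / sigma) ** 2
print(stats.kstest(w, stats.ncx2(1, mu**2 / sigma**2).cdf).pvalue)
</syntaxhighlight>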
===== Operations on two independent normal variables =====
* If <math display=inline>X_1</math> and <math display=inline>X_2</math> are two [[independence (probability theory)|independent]] normal random variables, with means <math display=inline>\mu_1</math>, <math display=inline>\mu_2</math> and variances <math display=inline>\sigma_1^2</math>, <math display=inline>\sigma_2^2</math>, then their sum <math display=inline>X_1 + X_2</math> will also be normally distributed,<sup>[[sum of normally distributed random variables|[proof]]]</sup> with mean <math display=inline>\mu_1 + \mu_2</math> and variance <math display=inline>\sigma_1^2 + \sigma_2^2</math>.
* In particular, if {{tmath|X}} and {{tmath|Y}} are independent normal deviates with zero mean and variance <math display=inline>\sigma^2</math>, then <math display=inline>X + Y</math> and <math display=inline>X - Y</math> are also independent and normally distributed, with zero mean and variance <math display=inline>2\sigma^2</math>. This is a special case of the [[polarization identity]].<ref>{{harvtxt |Bryc |1995 |p=27 }}</ref>
* If <math display=inline>X_1</math>, <math display=inline>X_2</math> are two independent normal deviates with mean {{tmath|\mu}} and variance <math display=inline>\sigma^2</math>, and {{tmath|a}}, {{tmath|b}} are arbitrary real numbers, then the variable <math display=block> X_3 = \frac{aX_1 + bX_2 - (a+b)\mu}{\sqrt{a^2+b^2}} + \mu </math> is also normally distributed with mean {{tmath|\mu}} and variance <math display=inline>\sigma^2</math>. It follows that the normal distribution is [[stable distribution|stable]] (with exponent <math display=inline>\alpha=2</math>).
* If <math display=inline>X_k \sim \mathcal N(m_k, \sigma_k^2)</math>, <math display=inline>k \in \{ 0, 1 \}</math>, are two normal distributions, then the normalized [[geometric mean]] of their densities, <math display=inline>\frac{1}{\int_{\R^n} X_0^{\alpha}(x) X_1^{1 - \alpha}(x) \, \text{d}x} X_0^{\alpha} X_1^{1 - \alpha}</math>, is the density of a normal distribution <math display=inline>\mathcal N(m_{\alpha}, \sigma_{\alpha}^2)</math> with <math display=inline>m_{\alpha} = \frac{\alpha m_0 \sigma_1^2 + (1 - \alpha) m_1 \sigma_0^2}{\alpha \sigma_1^2 + (1 - \alpha) \sigma_0^2}</math> and <math display=inline>\sigma_{\alpha}^2 = \frac{\sigma_0^2 \sigma_1^2}{\alpha \sigma_1^2 + (1 - \alpha) \sigma_0^2}</math>.

===== Operations on two independent standard normal variables =====
If <math display=inline>X_1</math> and <math display=inline>X_2</math> are two independent standard normal random variables with mean 0 and variance 1, then
* Their sum and difference are each distributed normally with mean zero and variance two: <math display=inline>X_1 \pm X_2 \sim \mathcal{N}(0, 2)</math>.
* Their product <math display=inline>Z = X_1 X_2</math> follows the [[product distribution#Independent central-normal distributions|product distribution]]<ref>{{cite web |url=http://mathworld.wolfram.com/NormalProductDistribution.html |title=Normal Product Distribution |work=MathWorld |publisher=wolfram.com |first=Eric W. |last=Weisstein}}</ref> with density function <math display=inline>f_Z(z) = \pi^{-1} K_0(|z|)</math>, where <math display=inline>K_0</math> is the [[Macdonald function|modified Bessel function of the second kind]]. This distribution is symmetric around zero, unbounded at <math display=inline>z = 0</math>, and has the [[characteristic function (probability theory)|characteristic function]] <math display=inline>\phi_Z(t) = (1 + t^2)^{-1/2}</math>.
* Their ratio follows the standard [[Cauchy distribution]]: <math display=inline>X_1/ X_2 \sim \operatorname{Cauchy}(0, 1)</math>.
* Their Euclidean norm <math display=inline>\sqrt{X_1^2 + X_2^2}</math> has the [[Rayleigh distribution]].
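The identities for a standard normal pair can be spot-checked the same way; the sketch below (again with an arbitrary seed and sample size) compares the simulated sum, ratio, and Euclidean norm against the stated distributions:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 1_000_000
x1 = rng.standard_normal(n)   # X1 ~ N(0, 1)
x2 = rng.standard_normal(n)   # X2 ~ N(0, 1)

# X1 + X2 ~ N(0, 2): the sample variance should be close to 2
print((x1 + x2).var())

# X1 / X2 follows the standard Cauchy distribution
print(stats.kstest(x1 / x2, stats.cauchy.cdf).pvalue)             # typically not small

# sqrt(X1^2 + X2^2) follows the Rayleigh distribution (unit scale)
print(stats.kstest(np.hypot(x1, x2), stats.rayleigh.cdf).pvalue)  # typically not small
</syntaxhighlight>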