Normal distribution
=== Operations and functions of normal variables ===
[[File:Probabilities of functions of normal vectors.png|thumb|right|'''a:''' Probability density of a function {{math|cos ''x''{{sup|2}}}} of a normal variable {{mvar|x}} with {{math|1= ''μ'' = −2}} and {{math|1= ''σ'' = 3}}. '''b:''' Probability density of a function {{mvar|x{{sup|y}}}} of two normal variables {{mvar|x}} and {{mvar|y}}, where {{math|1= ''μ{{sub|x}}'' = 1}}, {{math|1= ''μ{{sub|y}}'' = 2}}, {{math|1= ''σ{{sub|x}}'' = 0.1}}, {{math|1= ''σ{{sub|y}}'' = 0.2}}, and {{math|1= ''ρ{{sub|xy}}'' = 0.8}}. '''c:''' Heat map of the joint probability density of two functions of two correlated normal variables {{mvar|x}} and {{mvar|y}}, where {{math|1= ''μ{{sub|x}}'' = −2}}, {{math|1= ''μ{{sub|y}}'' = 5}}, {{math|1= {{subsup|σ|s=0|''x''|2}} = 10}}, {{math|1= {{subsup|σ|s=0|''y''|2}} = 20}}, and {{math|1= ''ρ{{sub|xy}}'' = 0.495}}. '''d:''' Probability density of a function {{math|{{abs|''x''{{sub|1}}}} + ... + {{abs|''x''{{sub|4}}}}}} of four [[iid]] standard normal variables. These are computed by the numerical method of ray-tracing.<ref name="Das-2021" />]]
The [[probability density]], [[cumulative distribution function|cumulative distribution]], and [[inverse cumulative distribution function|inverse cumulative distribution]] of any function of one or more independent or correlated normal variables can be computed with the numerical method of ray-tracing<ref name="Das-2021">{{cite journal |last=Das |first=Abhranil |arxiv=2012.14331 |title=A method to integrate and classify normal distributions |journal=Journal of Vision |date=2021 |volume=21 |issue=10 |page=1 |doi=10.1167/jov.21.10.1 |pmid=34468706 |pmc=8419883}}</ref> ([https://www.mathworks.com/matlabcentral/fileexchange/84973-integrate-and-classify-normal-distributions Matlab code]). In the following sections we look at some special cases.
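The cited paper supplies Matlab code for the ray-tracing method itself; as an informal cross-check, the same densities can be approximated by plain Monte Carlo sampling. The sketch below (Python with NumPy — the language and library are assumptions of this illustration, not part of the cited method) draws samples for panels '''a''' and '''d''' of the figure; a histogram of each sample array approximates the corresponding density.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Panel a: f(x) = cos(x^2) for x ~ N(mu = -2, sigma = 3)
x = rng.normal(-2.0, 3.0, n)
a_samples = np.cos(x**2)

# Panel d: |x1| + ... + |x4| for four iid standard normal variables
z = rng.standard_normal((n, 4))
d_samples = np.abs(z).sum(axis=1)

# Sanity check: E|Z| = sqrt(2/pi) for Z ~ N(0, 1), so the mean of
# panel d's variable is 4*sqrt(2/pi), roughly 3.19.
print(d_samples.mean())
```

Unlike ray-tracing, which evaluates the distribution functions directly, this only gives a sampling approximation, but it is enough to check a figure.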
==== Operations on a single normal variable ====
If {{tmath|X}} is distributed normally with mean {{tmath|\mu}} and variance <math display=inline>\sigma^2</math>, then:
* <math display=inline>aX+b</math>, for any real numbers {{tmath|a}} and {{tmath|b}}, is also normally distributed, with mean <math display=inline>a\mu+b</math> and variance <math display=inline>a^2\sigma^2</math>. That is, the family of normal distributions is closed under [[linear transformations]].
* The exponential of {{tmath|X}} is distributed [[Log-normal distribution|log-normally]]: <math display=inline>e^X \sim \ln(N(\mu, \sigma^2))</math>.
* The standard [[logistic function|sigmoid]] of {{tmath|X}} is [[Logit-normal distribution|logit-normally distributed]]: <math display=inline>\sigma(X) \sim P( \mathcal{N}(\mu,\,\sigma^2) )</math>.
* The absolute value of {{tmath|X}} has a [[folded normal distribution]]: <math display=inline>\left| X \right| \sim N_f(\mu, \sigma^2)</math>. If <math display=inline>\mu = 0</math> this is known as the [[half-normal distribution]].
* The absolute value of normalized residuals, <math display=inline>|X - \mu| / \sigma</math>, has a [[chi distribution]] with one degree of freedom: <math display=inline>|X - \mu| / \sigma \sim \chi_1</math>.
* The square of <math display=inline>X/\sigma</math> has the [[noncentral chi-squared distribution]] with one degree of freedom: <math display=inline>X^2 / \sigma^2 \sim \chi_1^2(\mu^2 / \sigma^2)</math>. If <math display=inline>\mu = 0</math>, the distribution is called simply [[chi-squared distribution|chi-squared]].
* The log-likelihood of a normal variable {{tmath|x}} is simply the log of its [[probability density function]]: <math display=block>\ln p(x)= -\frac{1}{2} \left(\frac{x-\mu}{\sigma} \right)^2 -\ln \left(\sigma \sqrt{2\pi} \right).</math> Since this is a scaled and shifted square of a standard normal variable, it is distributed as a scaled and shifted [[chi-squared distribution|chi-squared]] variable.
* The distribution of the variable {{tmath|X}} restricted to an interval <math display=inline>[a, b]</math> is called the [[truncated normal distribution]].
* <math display=inline>(X - \mu)^{-2}</math> has a [[Lévy distribution]] with location 0 and scale <math display=inline>\sigma^{-2}</math>.

===== Operations on two independent normal variables =====
* If <math display=inline>X_1</math> and <math display=inline>X_2</math> are two [[independence (probability theory)|independent]] normal random variables, with means <math display=inline>\mu_1</math>, <math display=inline>\mu_2</math> and variances <math display=inline>\sigma_1^2</math>, <math display=inline>\sigma_2^2</math>, then their sum <math display=inline>X_1 + X_2</math> will also be normally distributed,<sup>[[sum of normally distributed random variables|[proof]]]</sup> with mean <math display=inline>\mu_1 + \mu_2</math> and variance <math display=inline>\sigma_1^2 + \sigma_2^2</math>.
* In particular, if {{tmath|X}} and {{tmath|Y}} are independent normal deviates with zero mean and variance <math display=inline>\sigma^2</math>, then <math display=inline>X + Y</math> and <math display=inline>X - Y</math> are also independent and normally distributed, with zero mean and variance <math display=inline>2\sigma^2</math>. This is a special case of the [[polarization identity]].<ref>{{harvtxt |Bryc |1995 |p=27 }}</ref>
* If <math display=inline>X_1</math>, <math display=inline>X_2</math> are two independent normal deviates with mean {{tmath|\mu}} and variance <math display=inline>\sigma^2</math>, and {{tmath|a}}, {{tmath|b}} are arbitrary real numbers, then the variable <math display=block> X_3 = \frac{aX_1 + bX_2 - (a+b)\mu}{\sqrt{a^2+b^2}} + \mu </math> is also normally distributed with mean {{tmath|\mu}} and variance <math display=inline>\sigma^2</math>. It follows that the normal distribution is [[stable distribution|stable]] (with exponent <math display=inline>\alpha=2</math>).
* If <math display=inline>X_k \sim \mathcal N(m_k, \sigma_k^2)</math>, <math display=inline>k \in \{ 0, 1 \}</math> are normal distributions, then their normalized [[geometric mean]] <math display=inline>\frac{1}{\int_{\R^n} X_0^{\alpha}(x) X_1^{1 - \alpha}(x) \, \text{d}x} X_0^{\alpha} X_1^{1 - \alpha}</math> is a normal distribution <math display=inline>\mathcal N(m_{\alpha}, \sigma_{\alpha}^2)</math> with <math display=inline>m_{\alpha} = \frac{\alpha m_0 \sigma_1^2 + (1 - \alpha) m_1 \sigma_0^2}{\alpha \sigma_1^2 + (1 - \alpha) \sigma_0^2}</math> and <math display=inline>\sigma_{\alpha}^2 = \frac{\sigma_0^2 \sigma_1^2}{\alpha \sigma_1^2 + (1 - \alpha) \sigma_0^2}</math>.

===== Operations on two independent standard normal variables =====
If <math display=inline>X_1</math> and <math display=inline>X_2</math> are two independent standard normal random variables with mean 0 and variance 1, then:
* Their sum and difference are each distributed normally with mean zero and variance two: <math display=inline>X_1 \pm X_2 \sim \mathcal{N}(0, 2)</math>.
* Their product <math display=inline>Z = X_1 X_2</math> follows the [[product distribution#Independent central-normal distributions|product distribution]]<ref>{{cite web |url=http://mathworld.wolfram.com/NormalProductDistribution.html |title=Normal Product Distribution |work=MathWorld |publisher=wolfram.com |first=Eric W. |last=Weisstein}}</ref> with density function <math display=inline>f_Z(z) = \pi^{-1} K_0(|z|)</math> where <math display=inline>K_0</math> is the [[Macdonald function|modified Bessel function of the second kind]]. This distribution is symmetric around zero, unbounded at <math display=inline>z = 0</math>, and has the [[characteristic function (probability theory)|characteristic function]] <math display=inline>\phi_Z(t) = (1 + t^2)^{-1/2}</math>.
* Their ratio follows the standard [[Cauchy distribution]]: <math display=inline>X_1/ X_2 \sim \operatorname{Cauchy}(0, 1)</math>.
* Their Euclidean norm <math display=inline>\sqrt{X_1^2 + X_2^2}</math> has the [[Rayleigh distribution]].

==== Operations on multiple independent normal variables ====
* Any [[linear combination]] of independent normal deviates is a normal deviate.
* If <math display=inline>X_1, X_2, \ldots, X_n</math> are independent standard normal random variables, then the sum of their squares has the [[chi-squared distribution]] with {{tmath|n}} degrees of freedom <math display=block>X_1^2 + \cdots + X_n^2 \sim \chi_n^2.</math>
* If <math display=inline>X_1, X_2, \ldots, X_n</math> are independent normally distributed random variables with mean {{tmath|\mu}} and variance <math display=inline>\sigma^2</math>, then their [[sample mean]] is independent from the sample [[standard deviation]],<ref>{{cite journal |title=A Characterization of the Normal Distribution |last=Lukacs |first=Eugene |journal=[[The Annals of Mathematical Statistics]] |issn=0003-4851 |volume=13 |issue=1 |year=1942 |pages=91–3 |jstor=2236166 |doi=10.1214/aoms/1177731647 |doi-access=free}}</ref> which can be demonstrated using [[Basu's theorem]] or [[Cochran's theorem]].<ref>{{cite journal |title=On Some Characterizations of the Normal Distribution |last1=Basu |first1=D. |last2=Laha |first2=R. G. |journal=[[Sankhyā (journal)|Sankhyā]] |issn=0036-4452 |volume=13 |issue=4 |year=1954 |pages=359–62 |jstor=25048183}}</ref> The ratio of these two quantities will have the [[Student's t-distribution]] with <math display=inline>n-1</math> degrees of freedom: <math display=block>t = \frac{\overline X - \mu}{S/\sqrt{n}} = \frac{\frac{1}{n}(X_1+\cdots+X_n) - \mu}{\sqrt{\frac{1}{n(n-1)}\left[(X_1-\overline X)^2 + \cdots+(X_n-\overline X)^2\right]}} \sim t_{n-1}.</math>
* If <math display=inline>X_1, X_2, \ldots, X_n</math>, <math display=inline>Y_1, Y_2, \ldots, Y_m</math> are independent standard normal random variables, then the ratio of their normalized sums of squares will have the [[F-distribution]] with {{math|(''n'', ''m'')}} degrees of freedom:<ref>{{cite book |title=Testing Statistical Hypotheses |edition=2nd |first=E. L. |last=Lehmann |publisher=Springer |year=1997 |isbn=978-0-387-94919-2 |page=199}}</ref> <math display=block>F = \frac{\left(X_1^2+X_2^2+\cdots+X_n^2\right)/n}{\left(Y_1^2+Y_2^2+\cdots+Y_m^2\right)/m} \sim F_{n,m}.</math>

==== Operations on multiple correlated normal variables ====
* A [[quadratic form]] of a normal vector, i.e. a quadratic function <math display=inline>q = \sum x_i^2 + \sum x_j + c</math> of multiple independent or correlated normal variables, is a [[generalized chi-square distribution|generalized chi-square]] variable.
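Several of the identities above are easy to spot-check by simulation. The sketch below (Python with NumPy; the library choice is an assumption of this illustration, not part of the article) verifies the moments of a sum of independent normals, the chi-squared sum of squares, the variance-2 sum and difference of standard normals, and the Cauchy ratio:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Sum of independent normals: X ~ N(1, 4), Y ~ N(-3, 0.25)
# => X + Y has mean 1 + (-3) = -2 and variance 4 + 0.25 = 4.25.
x = rng.normal(1.0, 2.0, n)
y = rng.normal(-3.0, 0.5, n)
s = x + y
print(s.mean(), s.var())

# Sum of squares of k standard normals is chi-squared with k degrees
# of freedom, which has mean k (here k = 5).
z = rng.standard_normal((n, 5))
q = (z**2).sum(axis=1)
print(q.mean())

# For standard normals X1, X2: X1 + X2 and X1 - X2 each have variance 2.
u = rng.standard_normal(n)
v = rng.standard_normal(n)
print((u + v).var(), (u - v).var())

# Ratio of two independent standard normals is standard Cauchy (median 0).
r = u / v
print(np.median(r))
```

With 10<sup>6</sup> samples each printed statistic lands within Monte Carlo noise of its theoretical value; note the Cauchy ratio has no finite mean or variance, so only quantile-based checks such as the median are meaningful for it.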