=== Bayesian analysis of the normal distribution ===
Bayesian analysis of normally distributed data is complicated by the many different possibilities that may be considered:
* Either the mean, or the variance, or neither, may be considered a fixed quantity.
* When the variance is unknown, analysis may be done directly in terms of the variance, or in terms of the [[precision (statistics)|precision]], the reciprocal of the variance. The reason for expressing the formulas in terms of precision is that the analysis of most cases is simplified.
* Both univariate and [[multivariate normal distribution|multivariate]] cases need to be considered.
* Either [[conjugate prior|conjugate]] or [[improper prior|improper]] [[prior distribution]]s may be placed on the unknown variables.
* An additional set of cases occurs in [[Bayesian linear regression]], where in the basic model the data is assumed to be normally distributed, and normal priors are placed on the [[regression coefficient]]s. The resulting analysis is similar to the basic cases of [[independent identically distributed]] data.

The formulas for the non-linear-regression cases are summarized in the [[conjugate prior]] article.

==== Sum of two quadratics ====

===== Scalar form =====
The following auxiliary formula is useful for simplifying the [[posterior distribution|posterior]] update equations, which otherwise become fairly tedious.

<math display=block>a(x-y)^2 + b(x-z)^2 = (a + b)\left(x - \frac{ay+bz}{a+b}\right)^2 + \frac{ab}{a+b}(y-z)^2</math>

This equation rewrites the sum of two quadratics in ''x'' by expanding the squares, grouping the terms in ''x'', and [[completing the square]]. Note the following about the constant factors attached to some of the terms:
# The factor <math display=inline>\frac{ay+bz}{a+b}</math> has the form of a [[weighted average]] of ''y'' and ''z''.
# <math display=inline>\frac{ab}{a+b} = \frac{1}{\frac{1}{a}+\frac{1}{b}} = (a^{-1} + b^{-1})^{-1}.</math> This shows that this factor can be thought of as resulting from a situation where the [[Multiplicative inverse|reciprocals]] of quantities ''a'' and ''b'' add directly, so to combine ''a'' and ''b'' themselves, it is necessary to reciprocate, add, and reciprocate the result again to get back into the original units. This is exactly the sort of operation performed by the [[harmonic mean]], so it is not surprising that <math display=inline>\frac{ab}{a+b}</math> is one-half the [[harmonic mean]] of ''a'' and ''b''.
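A brief sketch of the algebra behind the identity (expanding the left-hand side in powers of ''x'' and then completing the square):

<math display=block>
\begin{align}
a(x-y)^2 + b(x-z)^2 &= (a+b)x^2 - 2(ay+bz)x + ay^2 + bz^2 \\
&= (a+b)\left(x - \frac{ay+bz}{a+b}\right)^2 + ay^2 + bz^2 - \frac{(ay+bz)^2}{a+b},
\end{align}
</math>

and the leftover constant simplifies to

<math display=block>\frac{(a+b)(ay^2+bz^2) - (ay+bz)^2}{a+b} = \frac{ab\,y^2 - 2ab\,yz + ab\,z^2}{a+b} = \frac{ab}{a+b}(y-z)^2.</math>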
===== Vector form =====
A similar formula can be written for the sum of two vector quadratics: If '''x''', '''y''', '''z''' are vectors of length ''k'', and '''A''' and '''B''' are [[symmetric matrix|symmetric]], [[invertible matrices]] of size <math display=inline>k\times k</math>, then

<math display=block>
\begin{align}
& (\mathbf{y}-\mathbf{x})'\mathbf{A}(\mathbf{y}-\mathbf{x}) + (\mathbf{x}-\mathbf{z})' \mathbf{B}(\mathbf{x}-\mathbf{z}) \\
= {} & (\mathbf{x} - \mathbf{c})'(\mathbf{A}+\mathbf{B})(\mathbf{x} - \mathbf{c}) + (\mathbf{y} - \mathbf{z})'(\mathbf{A}^{-1} + \mathbf{B}^{-1})^{-1}(\mathbf{y} - \mathbf{z})
\end{align}
</math>

where

<math display=block>\mathbf{c} = (\mathbf{A} + \mathbf{B})^{-1}(\mathbf{A}\mathbf{y} + \mathbf{B} \mathbf{z})</math>

The form <math display=inline>\mathbf{x}'\mathbf{A}\mathbf{x}</math> is called a [[quadratic form]] and is a [[scalar (mathematics)|scalar]]:
<math display=block>\mathbf{x}'\mathbf{A}\mathbf{x} = \sum_{i,j}a_{ij} x_i x_j</math>
In other words, it sums up all possible combinations of products of pairs of elements from '''x''', with a separate coefficient for each. In addition, since <math display=inline>x_i x_j = x_j x_i</math>, only the sum <math display=inline>a_{ij} + a_{ji}</math> matters for any off-diagonal elements of '''A''', and there is no loss of generality in assuming that '''A''' is [[symmetric matrix|symmetric]]. Furthermore, if '''A''' is symmetric, then the form <math display=inline>\mathbf{x}'\mathbf{A}\mathbf{y} = \mathbf{y}'\mathbf{A}\mathbf{x}.</math>

==== Sum of differences from the mean ====
Another useful formula is as follows:
<math display=block>\sum_{i=1}^n (x_i-\mu)^2 = \sum_{i=1}^n (x_i-\bar{x})^2 + n(\bar{x} -\mu)^2</math>
where <math display=inline>\bar{x} = \frac{1}{n} \sum_{i=1}^n x_i.</math>
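A sketch of the derivation: writing <math display=inline>x_i - \mu = (x_i - \bar{x}) + (\bar{x} - \mu)</math> and expanding the square,

<math display=block>\sum_{i=1}^n (x_i-\mu)^2 = \sum_{i=1}^n (x_i-\bar{x})^2 + 2(\bar{x}-\mu)\sum_{i=1}^n (x_i-\bar{x}) + n(\bar{x}-\mu)^2,</math>

where the middle term vanishes because <math display=inline>\sum_{i=1}^n (x_i-\bar{x}) = 0</math> by the definition of <math display=inline>\bar{x}</math>.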