{{Short description|Mathematical technique}}
{{More footnotes|date=November 2010}}

In [[statistics]], the '''algebra of random variables''' provides rules for the [[Symbolic computation|symbolic manipulation]] of [[random variable]]s, while avoiding delving too deeply into the mathematically sophisticated ideas of [[probability theory]]. Its symbolism allows the treatment of sums, products, ratios and general functions of random variables, as well as operations such as finding the [[probability distribution]]s and the [[expected values|expectations]] (or expected values), [[variance]]s and [[covariance]]s of such combinations.

In principle, the [[elementary algebra]] of random variables is equivalent to that of conventional non-random (or deterministic) variables. However, the changes that [[algebraic operation]]s produce in the probability distribution of a random variable are not straightforward, so the behavior of operators on the distribution, such as expected values, variances, covariances, and [[Moment (mathematics)|moments]], may differ from what elementary symbolic algebra suggests. It is possible to identify some key rules for each of those operators, resulting in different types of algebra for random variables beyond the elementary symbolic algebra: expectation algebra, variance algebra, covariance algebra, moment algebra, etc.

== Elementary symbolic algebra of random variables ==
Considering two random variables <math>X</math> and <math>Y</math>, the following algebraic operations are possible:
* [[Addition]]: <math>Z = X + Y = Y + X</math>
* [[Subtraction]]: <math>Z = X - Y = - Y + X</math>
* [[Multiplication]]: <math>Z = X Y = Y X</math>
* [[Division (mathematics)|Division]]: supposing <math> Y \neq 0 </math>, <math>Z = X / Y = X \cdot (1/Y) = (1/Y) \cdot X</math>
* [[Exponentiation]]: <math>Z = X^Y = e^{Y\ln(X)}</math>

In all cases, the variable <math>Z</math> resulting from each operation is also a random variable. All [[Commutative property|commutative]] and [[Associative property|associative]] properties of conventional algebraic operations are also valid for random variables. If any of the random variables is replaced by a deterministic variable or by a constant value, all the previous properties remain valid.

== Expectation algebra for random variables ==
The expected value <math>\operatorname{E}[Z]</math> of the random variable <math>Z</math> resulting from an algebraic operation between two random variables can be calculated using the following set of rules:
* [[Addition]]: <math>\operatorname{E}[Z] = \operatorname{E}[X+Y] = \operatorname{E}[X] + \operatorname{E}[Y] = \operatorname{E}[Y] + \operatorname{E}[X]</math>
* [[Subtraction]]: <math>\operatorname{E}[Z] = \operatorname{E}[X-Y] = \operatorname{E}[X] - \operatorname{E}[Y] = -\operatorname{E}[Y] + \operatorname{E}[X]</math>
* [[Multiplication]]: <math>\operatorname{E}[Z] = \operatorname{E}[XY] = \operatorname{E}[YX]</math>. In particular, if <math>X</math> and <math>Y</math> are [[Independence (probability theory)|independent]] of each other, then <math>\operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]</math>.
* [[Division (mathematics)|Division]]: <math>\operatorname{E}[Z] = \operatorname{E}[X/Y] = \operatorname{E}[X \cdot (1/Y)] = \operatorname{E}[(1/Y) \cdot X]</math>. In particular, if <math>X</math> and <math>Y</math> are independent of each other, then <math>\operatorname{E}[X/Y] = \operatorname{E}[X] \cdot \operatorname{E}[1/Y]</math>.
* [[Exponentiation]]: <math>\operatorname{E}[Z] = \operatorname{E}[X^Y] = \operatorname{E}[e^{Y\ln(X)}]</math>

If any of the random variables is replaced by a deterministic variable or by a constant value <math>k</math>, the previous properties remain valid, considering that <math>\Pr(X = k) = 1</math> and, therefore, <math>\operatorname{E}[X] = k</math>.

If <math>Z</math> is defined as a general non-linear algebraic function <math>f</math> of a random variable <math>X</math>, then in general
<math display="block">\operatorname{E}[Z] = \operatorname{E}[f(X)] \neq f(\operatorname{E}[X]).</math>
Some examples of this property include:
* <math>\operatorname{E}[X^2] \neq \operatorname{E}[X]^2</math>
* <math>\operatorname{E}[1/X] \neq 1/\operatorname{E}[X]</math>
* <math>\operatorname{E}[e^X] \neq e^{\operatorname{E}[X]}</math>
* <math>\operatorname{E}[\ln(X)] \neq \ln(\operatorname{E}[X])</math>

The exact value of the expectation of the non-linear function will depend on the particular probability distribution of the random variable <math>X</math>.
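These expectation rules are easy to check numerically. The following is a minimal Monte Carlo sketch (not part of the article; the NumPy distributions, sample size, and tolerances are arbitrary illustrative choices) showing linearity of expectation, the product rule for independent variables, and the failure of <math>\operatorname{E}[f(X)] = f(\operatorname{E}[X])</math> for a non-linear <math>f</math>:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent random variables (distributions chosen arbitrarily).
X = rng.exponential(scale=2.0, size=n)   # E[X] = 2, Var[X] = 4
Y = rng.uniform(1.0, 3.0, size=n)        # E[Y] = 2

# Linearity of expectation: E[X + Y] = E[X] + E[Y] (holds exactly,
# up to floating-point rounding, even without independence).
assert abs(np.mean(X + Y) - (np.mean(X) + np.mean(Y))) < 1e-8

# Product rule for independent variables: E[XY] ~= E[X] * E[Y].
assert abs(np.mean(X * Y) - np.mean(X) * np.mean(Y)) < 0.02

# Non-linearity: E[X^2] > E[X]^2, since the gap is Var[X] > 0.
assert np.mean(X**2) > np.mean(X) ** 2
```

Note that the addition rule holds as an identity of the sample means themselves, while the product rule only holds in the limit of many samples; hence the loose tolerance on the second check.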
== Variance algebra for random variables ==
The variance <math>\operatorname{Var}[Z]</math> of the random variable <math>Z</math> resulting from an algebraic operation between random variables can be calculated using the following set of rules:
* [[Addition]]: <math display="block">\operatorname{Var}[Z] = \operatorname{Var}[X+Y] = \operatorname{Var}[X] + 2 \operatorname{Cov}[X,Y] + \operatorname{Var}[Y].</math> In particular, if <math>X</math> and <math>Y</math> are [[Independence (probability theory)|independent]] of each other, then: <math display="block">\operatorname{Var}[X+Y] = \operatorname{Var}[X] + \operatorname{Var}[Y].</math>
* [[Subtraction]]: <math display="block">\operatorname{Var}[Z] = \operatorname{Var}[X-Y] = \operatorname{Var}[X] - 2 \operatorname{Cov}[X,Y] + \operatorname{Var}[Y].</math> In particular, if <math>X</math> and <math>Y</math> are independent of each other, then: <math display="block">\operatorname{Var}[X-Y] = \operatorname{Var}[X] + \operatorname{Var}[Y].</math> That is, for [[independent random variables]] the variance is the same for additions and subtractions: <math display="block">\operatorname{Var}[X+Y] = \operatorname{Var}[X-Y] = \operatorname{Var}[Y-X] = \operatorname{Var}[-X-Y].</math>
* [[Multiplication]]: <math display="block">\operatorname{Var}[Z] = \operatorname{Var}[XY] = \operatorname{Var}[YX].</math> In particular, if <math>X</math> and <math>Y</math> are independent of each other, then: <math display="block">\begin{align} \operatorname{Var}[XY] &= \operatorname{E}[X^2] \cdot \operatorname{E}[Y^2] - {\left(\operatorname{E}[X] \cdot \operatorname{E}[Y]\right)}^2 \\[2pt] &= \operatorname{Var}[X] \cdot \operatorname{Var}[Y] + \operatorname{Var}[X] \cdot {\left(\operatorname{E}[Y]\right)}^2 + \operatorname{Var}[Y] \cdot {\left(\operatorname{E}[X]\right)}^2. \end{align}</math>
* [[Division (mathematics)|Division]]: <math display="block">\operatorname{Var}[Z] = \operatorname{Var}[X/Y] = \operatorname{Var}[X \cdot (1/Y)] = \operatorname{Var}[(1/Y) \cdot X].</math> In particular, if <math>X</math> and <math>Y</math> are independent of each other, then: <math display="block">\begin{align} \operatorname{Var}[X/Y] &= \operatorname{E}[X^2] \cdot \operatorname{E}[1/Y^2] - {\left(\operatorname{E}[X] \cdot \operatorname{E}[1/Y]\right)}^2 \\[2pt] &= \operatorname{Var}[X] \cdot \operatorname{Var}[1/Y] + \operatorname{Var}[X] \cdot {\left(\operatorname{E}[1/Y]\right)}^2 + \operatorname{Var}[1/Y] \cdot {\left(\operatorname{E}[X]\right)}^2. \end{align}</math>
* [[Exponentiation]]: <math display="block">\operatorname{Var}[Z] = \operatorname{Var}[X^Y] = \operatorname{Var}[e^{Y\ln(X)}],</math>
where <math>\operatorname{Cov}[X,Y] = \operatorname{Cov}[Y,X]</math> represents the covariance operator between random variables <math>X</math> and <math>Y</math>.

The variance of a random variable can also be expressed directly in terms of the covariance or in terms of the expected value:
<math display="block">\operatorname{Var}[X] = \operatorname{Cov}[X,X] = \operatorname{E}[X^2] - \operatorname{E}[X]^2.</math>

If any of the random variables is replaced by a deterministic variable or by a constant value <math>k</math>, the previous properties remain valid considering that <math>\Pr(X = k) = 1</math> and, therefore, <math>\operatorname{E}[X] = k</math>, <math>\operatorname{Var}[X] = 0</math> and <math>\operatorname{Cov}[Y,k] = 0</math>.
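As a quick sanity check, the variance rules for independent variables can be verified by simulation. A minimal NumPy sketch (the distributions, seed, and tolerances are illustrative choices, not from the article):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000

# Independent random variables with known moments.
X = rng.normal(3.0, 2.0, size=n)    # E[X] = 3,  Var[X] = 4
Y = rng.gamma(2.0, 1.5, size=n)     # E[Y] = 3,  Var[Y] = 2 * 1.5**2 = 4.5

# Independence: Var[X + Y] = Var[X] + Var[Y], and the same for X - Y.
assert abs(np.var(X + Y) - (np.var(X) + np.var(Y))) < 0.05
assert abs(np.var(X - Y) - (np.var(X) + np.var(Y))) < 0.05

# Product rule for independent variables:
# Var[XY] = Var[X]Var[Y] + Var[X]E[Y]^2 + Var[Y]E[X]^2
lhs = np.var(X * Y)
rhs = (np.var(X) * np.var(Y)
       + np.var(X) * np.mean(Y) ** 2
       + np.var(Y) * np.mean(X) ** 2)
assert abs(lhs - rhs) / rhs < 0.05
```

The addition/subtraction checks would fail for correlated samples, where the <math>2\operatorname{Cov}[X,Y]</math> term is no longer negligible.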
Special cases are the addition and multiplication of a random variable with a deterministic variable or a constant, where:
* <math>\operatorname{Var}[k+Y] = \operatorname{Var}[Y]</math>
* <math>\operatorname{Var}[kY] = k^2 \operatorname{Var}[Y]</math>

If <math>Z</math> is defined as a general non-linear algebraic function <math>f</math> of a random variable <math>X</math>, then in general
<math display="block">\operatorname{Var}[Z] = \operatorname{Var}[f(X)] \neq f(\operatorname{Var}[X]).</math>
The exact value of the variance of the non-linear function will depend on the particular probability distribution of the random variable <math>X</math>.

== Covariance algebra for random variables ==
The covariance <math>\operatorname{Cov}[Z,X]</math> between the random variable <math>Z</math> resulting from an algebraic operation and the random variable <math>X</math> can be calculated using the following set of rules:
* [[Addition]]: <math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[X+Y,X] = \operatorname{Var}[X] + \operatorname{Cov}[X,Y].</math> If <math>X</math> and <math>Y</math> are [[Independence (probability theory)|independent]] of each other, then: <math display="block">\operatorname{Cov}[X+Y,X] = \operatorname{Var}[X].</math>
* [[Subtraction]]: <math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[X-Y,X] = \operatorname{Var}[X] - \operatorname{Cov}[X,Y].</math> If <math>X</math> and <math>Y</math> are independent of each other, then: <math display="block">\operatorname{Cov}[X-Y,X] = \operatorname{Var}[X].</math>
* [[Multiplication]]: <math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[XY,X] = \operatorname{E}[X^2Y] - \operatorname{E}[XY] \operatorname{E}[X].</math> If <math>X</math> and <math>Y</math> are independent of each other, then: <math display="block">\operatorname{Cov}[XY,X] = \operatorname{Var}[X] \cdot \operatorname{E}[Y].</math>
* [[Division (mathematics)|Division]] (covariance with respect to the numerator): <math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[X/Y,X] = \operatorname{E}[X^2/Y] - \operatorname{E}[X/Y] \operatorname{E}[X].</math> If <math>X</math> and <math>Y</math> are independent of each other, then: <math display="block">\operatorname{Cov}[X/Y,X] = \operatorname{Var}[X] \cdot \operatorname{E}[1/Y].</math>
* [[Division (mathematics)|Division]] (covariance with respect to the denominator): <math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[Y/X,X] = \operatorname{E}[Y] - \operatorname{E}[Y/X] \operatorname{E}[X].</math> If <math>X</math> and <math>Y</math> are independent of each other, then: <math display="block">\operatorname{Cov}[Y/X,X] = \operatorname{E}[Y] \cdot (1-\operatorname{E}[X] \cdot \operatorname{E}[1/X]).</math>
* [[Exponentiation]] (covariance with respect to the base): <math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[X^Y,X] = \operatorname{E}[X^{Y+1}]-\operatorname{E}[X^Y] \operatorname{E}[X].</math>
* [[Exponentiation]] (covariance with respect to the power): <math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[Y^X,X] = \operatorname{E}[XY^X]-\operatorname{E}[Y^X] \operatorname{E}[X].</math>

The covariance of two random variables can also be expressed directly in terms of the expected value:
<math display="block">\operatorname{Cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y].</math>

If any of the random variables is replaced by a deterministic variable or by a constant value {{nowrap|(<math>k</math>),}} the previous properties remain valid considering that {{nowrap|<math>\operatorname{E}[k] = k</math>,}} <math>\operatorname{Var}[k] = 0</math> and {{nowrap|<math>\operatorname{Cov}[X,k]=0</math>.}}

If <math>Z</math> is defined as a general non-linear algebraic function <math>f</math> of a random variable <math>X</math>, then:
<math display="block">\operatorname{Cov}[Z,X] = \operatorname{Cov}[f(X),X] = \operatorname{E}[Xf(X)] - \operatorname{E}[f(X)] \operatorname{E}[X].</math>
The exact value of the covariance of the non-linear function will depend on the particular probability distribution of the random variable <math>X</math>.

== Approximations by Taylor series expansions of moments ==
If the [[Moment (mathematics)|moments]] of a certain random variable <math>X</math> are known (or can be determined by integration if the [[probability density function]] is known), then it is possible to approximate the expected value of any general non-linear function <math>f(X)</math> as a [[Taylor expansions for the moments of functions of random variables|Taylor series expansion of the moments]], as follows:
<math display="block">f(X) = \sum_{n=0}^\infty \frac{1}{n!} \left(\frac{d^n f}{dX^n}\right)_{X=\mu} {\left(X - \mu\right)}^n,</math>
where <math>\mu = \operatorname{E}[X]</math> is the mean value of <math>X</math>. Then,
<math display="block">\begin{align} \operatorname{E}[f(X)] &= \operatorname{E}\left[ \sum_{n=0}^\infty \frac{1}{n!}\left(\frac{d^n f}{dX^n}\right)_{X=\mu} {\left(X-\mu\right)}^n\right] \\ &= \sum_{n=0}^\infty \frac{1}{n!}\left(\frac{d^n f}{dX^n}\right)_{X=\mu} \operatorname{E}\left[{\left(X - \mu\right)}^n\right] \\ &= \sum_{n=0}^\infty \frac{1}{n!}\left(\frac{d^n f}{dX^n}\right)_{X=\mu}\mu_n(X), \end{align}</math>
where <math>\mu_n(X) = \operatorname{E}[(X-\mu)^n]</math> is the ''n''-th moment of <math>X</math> about its mean. Note that, by definition, <math>\mu_0(X)=1</math> and <math>\mu_1(X)=0</math>; the first-order term always vanishes, but it is kept here to obtain a closed-form expression. Then,
<math display="block">\operatorname{E}[f(X)] \approx \sum_{n=0}^{n_{\max}} \frac{1}{n!} \left(\frac{d^n f}{dX^n}\right)_{X=\mu}\mu_n(X),</math>
where the Taylor expansion is truncated after the <math>n_{\max}</math>-th moment.
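The truncated moment expansion above can be tested on a case with a known closed form: for <math>f(x) = e^x</math> and <math>X \sim N(\mu, \sigma^2)</math>, the exact result is <math>\operatorname{E}[e^X] = e^{\mu + \sigma^2/2}</math> (the mean of a lognormal distribution). A small Python sketch (the function choice and parameter values are illustrative, not from the article):

```python
import math

# Truncated Taylor-moment approximation of E[f(X)] for f(x) = exp(x)
# and X ~ N(mu, sigma^2).  Every derivative of exp equals exp, so
#   E[f(X)] ~= sum_{n=0}^{n_max} exp(mu)/n! * mu_n(X),
# where the central moments of a normal variable are
#   mu_n(X) = sigma^n * (n-1)!! for even n, and 0 for odd n.
mu, sigma = 0.5, 0.4

def central_moment_normal(n: int, sigma: float) -> float:
    """n-th central moment of N(mu, sigma^2)."""
    if n % 2 == 1:
        return 0.0
    double_factorial = 1
    for i in range(1, n // 2 + 1):
        double_factorial *= 2 * i - 1
    return sigma**n * double_factorial

def approx_mean_exp(n_max: int) -> float:
    """Moment expansion of E[exp(X)], truncated at the n_max-th moment."""
    return sum(math.exp(mu) / math.factorial(n) * central_moment_normal(n, sigma)
               for n in range(n_max + 1))

exact = math.exp(mu + sigma**2 / 2)   # known closed form for E[exp(X)]
approx = approx_mean_exp(8)
assert abs(approx - exact) / exact < 1e-6
```

With <math>\sigma = 0.4</math> the truncated series converges quickly; for larger <math>\sigma</math> more moments are needed, consistent with the remark that the accuracy depends on the distribution of <math>X</math>.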
Particularly for functions of [[normal random variable]]s, it is possible to obtain a Taylor expansion in terms of the [[standard normal distribution]]:<ref>{{Cite journal|last=Hernandez|first=Hugo|date=2016|title=Modelling the effect of fluctuation in nonlinear systems using variance algebra - Application to light scattering of ideal gases|journal=ForsChem Research Reports|language=en|volume=2016-1|doi=10.13140/rg.2.2.36501.52969}}</ref>
<math display="block">f(X) = \sum_{n=0}^\infty \frac{\sigma^n}{n!} \left(\frac{d^n f}{dX^n}\right)_{X=\mu} Z^n,</math>
where <math>X \sim N(\mu,\sigma^2)</math> is a normal random variable and <math>Z = (X-\mu)/\sigma \sim N(0,1)</math> follows the standard normal distribution. Thus,
<math display="block">\operatorname{E}[f(X)] \approx \sum_{n=0}^{n_{\max}} \frac{\sigma^n}{n!} \left(\frac{d^n f}{dX^n}\right)_{X=\mu} \mu_n(Z),</math>
where the moments of the standard normal distribution are given by:
<math display="block">\mu_n(Z) = \begin{cases} \prod_{i=1}^{n/2}(2i-1), & \text{if } n \text{ is even} \\ 0, & \text{if } n \text{ is odd.} \end{cases}</math>
Similarly, for normal random variables it is also possible to approximate the variance of the non-linear function as a Taylor series expansion:
<math display="block">\operatorname{Var}[f(X)] \approx \sum_{n=1}^{n_{\max}} {\left(\frac{\sigma^n}{n!} \left(\frac{d^n f}{dX^n}\right)_{X=\mu}\right)}^2 \operatorname{Var}[Z^n] + \sum_{n=1}^{n_{\max}} \sum_{m \neq n} \frac{\sigma^{n+m}}{n!\,m!} \left(\frac{d^n f}{dX^n}\right)_{X=\mu} \left(\frac{d^m f}{dX^m}\right)_{X=\mu} \operatorname{Cov}[Z^n,Z^m],</math>
where
<math display="block">\operatorname{Var}[Z^n] = \begin{cases} \prod_{i=1}^{n}(2i-1) - \prod_{i=1}^{n/2}(2i-1)^2, & \text{if } n \text{ is even} \\ \prod_{i=1}^{n}(2i-1), & \text{if } n \text{ is odd,} \end{cases}</math>
and
<math display="block">\operatorname{Cov}[Z^n,Z^m] = \begin{cases} \prod_{i=1}^{(n+m)/2}(2i-1) - \prod_{i=1}^{n/2}(2i-1) \prod_{j=1}^{m/2}(2j-1), & \text{if } n \text{ and } m \text{ are even} \\ \prod_{i=1}^{(n+m)/2}(2i-1), & \text{if } n \text{ and } m \text{ are odd} \\ 0, & \text{otherwise.} \end{cases}</math>

== Algebra of complex random variables ==
In the [[algebra]]ic [[axiom]]atization of [[probability theory]], the primary concept is not the probability of an event, but rather that of a [[random variable]]. [[Probability distribution]]s are determined by assigning an [[expected value|expectation]] to each random variable. The [[measure (mathematics)|measurable space]] and the probability measure arise from the random variables and expectations by means of well-known [[representation theorem]]s of analysis. One of the important features of the algebraic approach is that apparently infinite-dimensional probability distributions are not harder to formalize than finite-dimensional ones.

Random variables are assumed to have the following properties:
# [[complex number|complex]] constants are possible [[Realization (probability)|realizations]] of a random variable;
# the sum of two random variables is a random variable;
# the product of two random variables is a random variable;
# addition and multiplication of random variables are both [[commutative]]; and
# there is a notion of conjugation of random variables, satisfying {{math|1=(''XY'')<sup>*</sup> = ''Y''<sup>*</sup>''X''<sup>*</sup>}} and {{math|1=''X''<sup>**</sup> = ''X''}} for all random variables {{math|''X'',''Y''}} and coinciding with complex conjugation if {{math|''X''}} is a constant.

This means that random variables form complex commutative [[*-algebra]]s. If {{math|1=''X'' = ''X''<sup>*</sup>}}, then the random variable {{math|''X''}} is called "real".

An expectation {{math|E}} on an algebra {{math|''A''}} of random variables is a normalized, positive [[linear functional]].
What this means is that
# {{math|1=E[''k''] = ''k''}} where {{math|''k''}} is a constant;
# {{math|E[''X''<sup>*</sup>''X''] ≥ 0}} for all random variables {{math|''X''}};
# {{math|1=E[''X'' + ''Y''] = E[''X''] + E[''Y'']}} for all random variables {{math|''X''}} and {{math|''Y''}}; and
# {{math|1=E[''kX''] = ''k''E[''X'']}} if {{math|''k''}} is a constant.

One may generalize this setup, allowing the algebra to be noncommutative. This leads to other areas of noncommutative probability, such as [[quantum probability]], [[random matrix theory]], and [[free probability]].

== See also ==
* [[Relationships among probability distributions]]
* [[Ratio distribution]]
** [[Cauchy distribution]]
** [[Slash distribution]]
* [[Inverse distribution]]
* [[Product distribution]]
* [[Mellin transform]]
* [[Sum of normally distributed random variables]]
* [[List of convolutions of probability distributions]] – the [[probability measure]] of the sum of [[independent random variables]] is the [[Convolution#Measures|convolution]] of their probability measures.
* [[Law of total expectation]]
* [[Law of total variance]]
* [[Law of total covariance]]
* [[Law of total cumulance]]
* [[Taylor expansions for the moments of functions of random variables]]
* [[Delta method]]

== References ==
<references />

== Further reading ==
* {{cite book |last=Whittle |first=Peter |author-link=Peter Whittle (mathematician) |title=Probability via Expectation |year=2000 |publisher=Springer |location=New York, NY |isbn=978-0-387-98955-6 |url=https://www.springer.com/statistics/book/978-0-387-98955-6 |edition=4th |access-date=24 September 2012}}
* {{Cite book |last=Springer |first=Melvin Dale |title=The Algebra of Random Variables |url=https://books.google.com/books?id=qUDvAAAAMAAJ |access-date=24 September 2012 |publisher=[[John Wiley & Sons|Wiley]] |year=1979 |isbn=0-471-01406-0}}
* {{SpringerEOM|id=Measure_algebra_(measure_theory)|title=Measure algebra}}

{{DEFAULTSORT:Algebra Of Random Variables}}
[[Category:Algebra of random variables| ]]