==Uses and applications==
{{See also|Estimation theory}}
The expectation of a random variable plays an important role in a variety of contexts. In [[statistics]], where one seeks [[Estimator|estimates]] for unknown [[Statistical parameter|parameters]] based on data gained from [[Sampling (statistics)|samples]], the [[sample mean]] serves as an estimate for the expectation and is itself a random variable. In such settings, the sample mean meets the desirable criterion for a "good" estimator of being [[Bias of an estimator|unbiased]]; that is, the expected value of the estimate is equal to the [[true value]] of the underlying parameter.

For a different example, in [[decision theory]], an agent making an optimal choice in the context of incomplete information is often assumed to maximize the expected value of its [[von Neumann–Morgenstern utility function|utility function]].

It is possible to construct an expected value equal to the probability of an event by taking the expectation of an [[indicator function]] that is one if the event has occurred and zero otherwise. This relationship can be used to translate properties of expected values into properties of probabilities, e.g. using the [[law of large numbers]] to justify estimating probabilities by [[Frequency (statistics)|frequencies]].

The expected values of the powers of ''X'' are called the [[moment (mathematics)|moments]] of ''X''; the [[moment about the mean|moments about the mean]] of ''X'' are expected values of powers of {{math|''X'' − E[''X'']}}. The moments of some random variables can be used to specify their distributions, via their [[moment generating function]]s.

To empirically estimate the expected value of a random variable, one repeatedly measures observations of the variable and computes the [[arithmetic mean]] of the results.
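The empirical estimation procedure just described, together with the indicator-function identity {{math|P(''X'' ∈ ''A'') {{=}} E[1<sub>''A''</sub>]}}, can be sketched in a few lines of Python. The fair six-sided die used here is a hypothetical example with known expectation E[''X''] = 3.5 and P(''X'' ≥ 5) = 1/3:

```python
import random

random.seed(0)

# Hypothetical example: X uniform on {1, ..., 6} (a fair die), so E[X] = 3.5.
samples = [random.randint(1, 6) for _ in range(100_000)]

# The sample mean is an unbiased estimate of E[X].
sample_mean = sum(samples) / len(samples)

# P(X in A) = E[1_A]: estimate P(X >= 5) as the sample mean of an indicator.
indicator_mean = sum(1 for x in samples if x >= 5) / len(samples)

print(sample_mean)     # close to 3.5
print(indicator_mean)  # close to 1/3
```

Both estimates are themselves random variables; by the law of large numbers they concentrate around the true values as the sample grows.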
If the expected value exists, this procedure estimates the true expected value in an unbiased manner and has the property of minimizing the sum of the squares of the [[errors and residuals in statistics|residuals]] (the sum of the squared differences between the observations and the estimate). The law of large numbers demonstrates (under fairly mild conditions) that, as the [[Sample size|size]] of the sample gets larger, the [[variance]] of this estimate gets smaller.

This property is exploited in a wide variety of applications, including general problems of [[Estimation theory|statistical estimation]] and [[machine learning]], to estimate (probabilistic) quantities of interest via [[Monte Carlo methods]], since most quantities of interest can be written in terms of expectation, e.g. <math>\operatorname{P}({X \in \mathcal{A}}) = \operatorname{E}[{\mathbf 1}_{\mathcal{A}}],</math> where <math>{\mathbf 1}_{\mathcal{A}}</math> is the indicator function of the set <math>\mathcal{A}.</math>

[[File:Beta first moment.svg|thumb|The mass of probability distribution is balanced at the expected value, here a Beta(α,β) distribution with expected value α/(α+β).]]
In [[classical mechanics]], the [[center of mass]] is an analogous concept to expectation. For example, suppose ''X'' is a discrete random variable with values ''x<sub>i</sub>'' and corresponding probabilities ''p<sub>i</sub>''. Now consider a weightless rod on which are placed weights at locations ''x<sub>i</sub>'' along the rod, having masses ''p<sub>i</sub>'' (whose sum is one). The point at which the rod balances is E[''X''].

Expected values can also be used to compute the variance, by means of the computational formula for the variance
<math display="block">\operatorname{Var}(X)= \operatorname{E}[X^2] - (\operatorname{E}[X])^2.</math>

A very important application of the expectation value is in the field of [[quantum mechanics]].
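Before turning to quantum mechanics: the computational formula for the variance can be verified exactly on a small discrete distribution. The fair die below is a hypothetical example, computed with exact rational arithmetic:

```python
# Exact check of Var(X) = E[X^2] - (E[X])^2 for a fair die (hypothetical example).
from fractions import Fraction

values = range(1, 7)
p = Fraction(1, 6)  # each face is equally likely

e_x = sum(p * x for x in values)        # E[X]   = 7/2
e_x2 = sum(p * x * x for x in values)   # E[X^2] = 91/6

variance = e_x2 - e_x**2                # 91/6 - 49/4 = 35/12
print(variance)  # 35/12
```

Using <code>Fraction</code> avoids floating-point rounding, so the identity holds exactly rather than only approximately.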
The [[Expectation value (quantum mechanics)|expectation value of a quantum mechanical operator]] <math>\hat{A}</math> operating on a [[quantum state]] vector <math>|\psi\rangle</math> is written as <math>\langle\hat{A}\rangle = \langle\psi|\hat{A}|\psi\rangle.</math> The [[uncertainty principle|uncertainty]] in <math>\hat{A}</math> can be calculated by the formula <math>(\Delta A)^2 = \langle\hat{A}^2\rangle - \langle \hat{A} \rangle^2</math>.
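As an illustrative numerical sketch (a hypothetical example, not taken from the article), both formulas can be evaluated for the Pauli-Z operator acting on the single-qubit state <math>|\psi\rangle = (|0\rangle + |1\rangle)/\sqrt{2}</math>, for which <math>\langle\hat{Z}\rangle = 0</math> and <math>(\Delta Z)^2 = 1</math>:

```python
import math

# Hypothetical example: Pauli-Z on |psi> = (|0> + |1>)/sqrt(2),
# using plain Python lists as vectors/matrices.
Z = [[1, 0],
     [0, -1]]
psi = [1 / math.sqrt(2), 1 / math.sqrt(2)]

def matvec(A, v):
    """Apply matrix A to vector v."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def matmul(A, B):
    """Matrix product A @ B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inner(u, v):
    """<u|v>, conjugating the bra."""
    return sum(complex(a).conjugate() * b for a, b in zip(u, v))

exp_Z = inner(psi, matvec(Z, psi)).real             # <psi|Z|psi>   = 0
exp_Z2 = inner(psi, matvec(matmul(Z, Z), psi)).real  # <psi|Z^2|psi> = 1

uncertainty_sq = exp_Z2 - exp_Z**2  # (Delta Z)^2 = 1
print(exp_Z, uncertainty_sq)
```

Here the measurement outcomes ±1 are equally likely, so the expectation vanishes while the spread (uncertainty squared) is maximal, mirroring the classical variance formula above.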