==Calculation==
The moment-generating function is the expectation of a function of the random variable. It can be written as:
* For a discrete [[probability mass function]], <math>M_X(t)=\sum_{i=0}^\infty e^{tx_i}\, p_i</math>
* For a continuous [[probability density function]], <math>M_X(t) = \int_{-\infty}^\infty e^{tx} f(x)\,dx</math>
* In the general case, <math>M_X(t) = \int_{-\infty}^\infty e^{tx}\,dF(x)</math>, using the [[Riemann–Stieltjes integral]], where <math>F</math> is the [[cumulative distribution function]]. This is simply the [[Laplace–Stieltjes transform]] of <math>F</math>, but with the sign of the argument reversed.

Note that for the case where <math>X</math> has a continuous [[probability density function]] <math>f(x)</math>, <math>M_X(-t)</math> is the [[two-sided Laplace transform]] of <math>f(x)</math>.

In the continuous case, expanding <math>e^{tx}</math> in its Taylor series and integrating term by term gives
<math display="block">\begin{align}
M_X(t) & = \int_{-\infty}^\infty e^{tx} f(x)\,dx \\[1ex]
& = \int_{-\infty}^\infty \left( 1+ tx + \frac{t^2 x^2}{2!} + \cdots + \frac{t^n x^n}{n!} + \cdots\right) f(x)\,dx \\[1ex]
& = 1 + tm_1 + \frac{t^2 m_2}{2!} + \cdots + \frac{t^n m_n}{n!} +\cdots,
\end{align}</math>
where <math>m_n</math> is the <math>n</math>th [[moment (mathematics)|moment]].

===Linear transformations of random variables===
If the random variable <math>X</math> has moment-generating function <math>M_X(t)</math>, then <math>\alpha X + \beta</math> has moment-generating function <math>M_{\alpha X + \beta}(t) = e^{\beta t}M_X(\alpha t)</math>:
<math display="block"> M_{\alpha X + \beta}(t) = \operatorname{E}\left[e^{(\alpha X + \beta) t}\right] = e^{\beta t} \operatorname{E}\left[e^{\alpha Xt}\right] = e^{\beta t} M_X(\alpha t). </math>

===Linear combination of independent random variables===
If <math display="inline">S_n = \sum_{i=1}^n a_i X_i</math>, where the {{math|''X''<sub>''i''</sub>}} are independent random variables and the {{math|''a''<sub>''i''</sub>}} are constants, then the probability density function of {{math|''S''<sub>''n''</sub>}} is the [[convolution]] of the probability density functions of the {{math|''a''<sub>''i''</sub>''X''<sub>''i''</sub>}}, and the moment-generating function of {{math|''S''<sub>''n''</sub>}} is given by
<math display="block"> M_{S_n}(t) = M_{X_1}(a_1t) M_{X_2}(a_2t) \cdots M_{X_n}(a_nt) \, . </math>

===Vector-valued random variables===
For [[random vector|vector-valued random variables]] <math>\mathbf X</math> with [[real number|real]] components, the moment-generating function is given by
<math display="block"> M_X(\mathbf t) = \operatorname{E}\left[e^{\langle \mathbf t, \mathbf X \rangle}\right], </math>
where <math>\mathbf t</math> is a vector and <math>\langle \cdot, \cdot \rangle</math> is the [[dot product]].
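In particular, for any fixed vector <math>\mathbf a</math>, the scalar random variable <math>\langle \mathbf a, \mathbf X \rangle</math> has moment-generating function
<math display="block"> M_{\langle \mathbf a, \mathbf X \rangle}(t) = \operatorname{E}\left[e^{t \langle \mathbf a, \mathbf X \rangle}\right] = M_X(t \mathbf a), </math>
which recovers the scalar linear-combination rule above when the components of <math>\mathbf X</math> are independent.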
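For example, if <math>\mathbf X</math> follows a [[multivariate normal distribution]] with mean vector <math>\boldsymbol\mu</math> and covariance matrix <math>\boldsymbol\Sigma</math>, its moment-generating function is
<math display="block"> M_X(\mathbf t) = \exp\left(\langle \mathbf t, \boldsymbol\mu \rangle + \tfrac{1}{2} \langle \mathbf t, \boldsymbol\Sigma \mathbf t \rangle\right). </math>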