==Definition== Let <math> X </math> be a [[random variable]] with [[Cumulative distribution function|CDF]] <math>F_X</math>. The moment-generating function (mgf) of <math>X</math> (or <math>F_X</math>), denoted by <math>M_X(t)</math>, is <math display="block"> M_X(t) = \operatorname E \left[e^{tX}\right] </math> provided this [[expected value|expectation]] exists for <math>t</math> in some open [[Neighborhood (mathematics)|neighborhood]] of 0. That is, there is an <math>h > 0</math> such that for all <math>t</math> in <math>-h < t < h</math>, <math>\operatorname E \left[e^{tX}\right] </math> exists. If the expectation does not exist in an open neighborhood of 0, we say that the moment-generating function does not exist.<ref>{{cite book |last1=Casella |first1=George|last2= Berger|first2= Roger L. |title=Statistical Inference |publisher=Wadsworth & Brooks/Cole|year=1990 |page=61 |isbn=0-534-11958-1 }}</ref> In other words, the moment-generating function of {{mvar|X}} is the [[expected value|expectation]] of the random variable <math> e^{tX}</math>. More generally, when <math>\mathbf X = ( X_1, \ldots, X_n)^{\mathrm{T}}</math>, an <math>n</math>-dimensional [[random vector]], and <math>\mathbf t</math> is a fixed vector, one uses <math>\mathbf t \cdot \mathbf X = \mathbf t^\mathrm T\mathbf X</math> instead of {{nowrap|<math>tX</math>:}} <math display="block"> M_{\mathbf X}(\mathbf t) := \operatorname E \left[e^{\mathbf t^\mathrm T\mathbf X}\right].</math> <math> M_X(0) </math> always exists and is equal to 1. However, a key problem with moment-generating functions is that moments and the moment-generating function may not exist, as the integrals need not converge absolutely. By contrast, the [[Characteristic function (probability theory)|characteristic function]] or Fourier transform always exists (because it is the integral of a bounded function on a space of finite [[measure (mathematics)|measure]]), and for some purposes may be used instead. 
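The definition <math>M_X(t) = \operatorname E[e^{tX}]</math> can be illustrated numerically. The sketch below (an illustration, not part of the article) estimates the expectation by Monte Carlo for a standard normal <math>X</math>, whose MGF is known in closed form to be <math>e^{t^2/2}</math>:

```python
import math
import random

# Monte Carlo sketch of M_X(t) = E[e^{tX}] for X ~ N(0, 1), compared
# with the known closed form M_X(t) = exp(t^2 / 2).  This illustrates
# the definition; it is not a proof that the expectation exists.

random.seed(42)  # reproducible draws

def mgf_estimate(t, n=100_000):
    """Sample-mean estimate of E[e^{tX}] for standard-normal X."""
    return sum(math.exp(t * random.gauss(0.0, 1.0)) for _ in range(n)) / n

t = 0.7
print(mgf_estimate(t), math.exp(t * t / 2))  # estimate ≈ exact ≈ 1.2776
```

Note that at <math>t = 0</math> every sample of <math>e^{tX}</math> equals 1, so the estimate returns exactly 1, matching the fact that <math>M_X(0) = 1</math> always holds.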
The moment-generating function is so named because it can be used to find the moments of the distribution.<ref>{{cite book |last=Bulmer |first=M. G. |title=Principles of Statistics |publisher=Dover |year=1979 |pages=75–79 |isbn=0-486-63760-3 }}</ref> The series expansion of <math>e^{tX}</math> is <math display="block"> e^{t X} = 1 + t X + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots + \frac{t^n X^n}{n!} + \cdots. </math> Hence, <math display="block">\begin{align} M_X(t) &= \operatorname E [e^{t X}] \\[1ex] &= 1 + t \operatorname E[X] + \frac{t^2 \operatorname E[X^2]}{2!} + \frac{t^3 \operatorname E[X^3]}{3!} + \cdots + \frac{t^n\operatorname E [X^n]}{n!}+\cdots \\[1ex] & = 1 + t m_1 + \frac{t^2 m_2}{2!} + \frac{t^3 m_3}{3!} + \cdots + \frac{t^n m_n}{n!} + \cdots, \end{align}</math> where <math>m_n</math> is the {{nowrap|<math>n</math>-th}} [[moment (mathematics)|moment]]. Differentiating <math>M_X(t)</math> <math>i</math> times with respect to <math>t</math> and setting <math>t = 0</math>, we obtain the <math>i</math>-th moment about the origin, <math>m_i</math>; see {{slink|#Calculations of moments}} below. 
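The moment-extraction property <math>m_n = M_X^{(n)}(0)</math> can be checked numerically. The sketch below (illustrative, using only the closed-form MGF <math>M(t) = e^{t^2/2}</math> of the standard normal) approximates the derivatives at 0 by central finite differences and recovers the known moments <math>m_1 = 0</math>, <math>m_2 = 1</math>, <math>m_3 = 0</math>, <math>m_4 = 3</math>:

```python
import math

# Recover the first moments of X ~ N(0, 1) by differentiating its MGF
# M(t) = exp(t^2 / 2) at t = 0 with central finite differences.
# Exact standard-normal moments: m1 = 0, m2 = 1, m3 = 0, m4 = 3.

def M(t):
    return math.exp(t * t / 2)  # closed-form MGF of the standard normal

def derivative_at_zero(f, n, h=1e-2):
    """n-th derivative of f at 0 via the binomial-weighted central
    difference: f^(n)(0) ~ sum_k (-1)^k C(n,k) f((n/2 - k) h) / h^n."""
    return sum((-1) ** k * math.comb(n, k) * f((n / 2 - k) * h)
               for k in range(n + 1)) / h ** n

moments = [derivative_at_zero(M, n) for n in range(1, 5)]
print(moments)  # ≈ [0, 1, 0, 3], up to O(h^2) discretization error
```

The odd moments come out (numerically) zero because <math>M(t)</math> is an even function, mirroring the symmetry of the normal distribution.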
If <math>X</math> is a continuous random variable, the following relation between its moment-generating function <math>M_X(t)</math> and the [[two-sided Laplace transform]] of its probability density function <math>f_X(x)</math> holds: <math display="block">M_X(t) = \mathcal{L}\{f_X\}(-t),</math> since the PDF's two-sided Laplace transform is given as <math display="block">\mathcal{L}\{f_X\}(s) = \int_{-\infty}^\infty e^{-sx} f_X(x)\, dx,</math> and the moment-generating function's definition expands (by the [[law of the unconscious statistician]]) to <math display="block">M_X(t) = \operatorname E \left[e^{tX}\right] = \int_{-\infty}^\infty e^{tx} f_X(x)\, dx.</math> This is consistent with the characteristic function of <math>X</math> being a [[Wick rotation]] of <math>M_X(t)</math> when the moment generating function exists, as the characteristic function of a continuous random variable <math>X</math> is the [[Fourier transform]] of its probability density function <math>f_X(x)</math>, and in general when a function <math>f(x)</math> is of [[exponential order]], the Fourier transform of <math>f</math> is a Wick rotation of its two-sided Laplace transform in the region of convergence. See [[Fourier transform#Laplace transform|the relation of the Fourier and Laplace transforms]] for further information.
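The identity <math>M_X(t) = \mathcal{L}\{f_X\}(-t)</math> can likewise be verified numerically. The sketch below (illustrative, for a standard normal <math>X</math>) evaluates the two-sided Laplace transform of the density by a midpoint-rule integral at <math>s = -t</math> and compares it with the closed-form MGF <math>e^{t^2/2}</math>:

```python
import math
from statistics import NormalDist

# Check M_X(t) = L{f_X}(-t) for X ~ N(0, 1): evaluate the two-sided
# Laplace transform L{f}(s) = ∫ e^{-sx} f(x) dx by the midpoint rule
# at s = -t, and compare with the closed-form MGF exp(t^2 / 2).

def laplace_two_sided(f, s, lo=-12.0, hi=12.0, n=200_000):
    """Midpoint-rule approximation of the two-sided Laplace transform
    over [lo, hi]; the normal density's tails beyond that are negligible."""
    dx = (hi - lo) / n
    return sum(math.exp(-s * (lo + (k + 0.5) * dx)) * f(lo + (k + 0.5) * dx)
               for k in range(n)) * dx

pdf = NormalDist().pdf  # standard-normal density f_X
t = 0.5
print(laplace_two_sided(pdf, -t), math.exp(t * t / 2))  # both ≈ 1.1331
```

At <math>s = 0</math> the transform reduces to <math>\int f_X(x)\,dx = 1</math>, consistent with <math>M_X(0) = 1</math>.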