== Definition ==

=== Marginal probability mass function ===
Given a known [[joint distribution]] of two '''discrete''' [[random variable]]s, say, {{mvar|X}} and {{mvar|Y}}, the marginal distribution of either variable – {{mvar|X}}, for example – is the [[probability distribution]] of {{mvar|X}} when the values of {{mvar|Y}} are not taken into consideration. This can be calculated by summing the [[joint probability]] distribution over all values of {{mvar|Y}}. Naturally, the converse is also true: the marginal distribution can be obtained for {{mvar|Y}} by summing over the separate values of {{mvar|X}}.

:<math>p_X(x_i)=\sum_{j}p(x_i,y_j)</math>, and <math>p_Y(y_j)=\sum_{i}p(x_i,y_j)</math>

{| class="wikitable" style="text-align: center; margin: 1em auto;"
! {{diagonal split header|''Y''|''X''}} || ''x''<sub>1</sub> || ''x''<sub>2</sub> || ''x''<sub>3</sub> || ''x''<sub>4</sub> || ''p<sub>Y</sub>''(''y'') ↓
|-
! ''y''<sub>1</sub>
| {{sfrac|4|32}} || {{sfrac|2|32}} || {{sfrac|1|32}} || {{sfrac|1|32}}
! {{sfrac|8|32}}
|-
! ''y''<sub>2</sub>
| {{sfrac|3|32}} || {{sfrac|6|32}} || {{sfrac|3|32}} || {{sfrac|3|32}}
! {{sfrac|15|32}}
|-
! ''y''<sub>3</sub>
| {{sfrac|9|32}} || 0 || 0 || 0
! {{sfrac|9|32}}
|-
! ''p<sub>X</sub>''(''x'') →
! {{sfrac|16|32}} || {{sfrac|8|32}} || {{sfrac|4|32}} || {{sfrac|4|32}}
! {{sfrac|32|32}}
|-
|+ style="caption-side: bottom;" | {{nobold|Joint and marginal distributions of a pair of discrete random variables, ''X'' and ''Y'', dependent, thus having nonzero [[mutual information]] {{math|''I''(''X''; ''Y'')}}. The values of the joint distribution are in the 3×4 rectangle; the values of the marginal distributions are along the right and bottom margins.}}
|}

A '''marginal probability''' can always be written as an [[expected value]]:
<math display="block">p_X(x) = \int_y p_{X \mid Y}(x \mid y) \, p_Y(y) \, \mathrm{d}y = \operatorname{E}_{Y} [p_{X \mid Y}(x \mid Y)]\;.</math>

Intuitively, the marginal probability of ''X'' is computed by examining the conditional probability of ''X'' given a particular value of ''Y'', and then averaging this conditional probability over the distribution of all values of ''Y''. This follows from the definition of [[expected value]] (after applying the [[law of the unconscious statistician]]):
<math display="block">\operatorname{E}_Y [f(Y)] = \int_y f(y) p_Y(y) \, \mathrm{d}y.</math>

Therefore, marginalization provides the rule for the transformation of the probability distribution of a random variable ''Y'' and another random variable {{math|''X''{{=}}''g''(''Y'')}}:
<math display="block">p_X(x) = \int_y p_{X \mid Y}(x \mid y) \, p_Y(y) \, \mathrm{d}y = \int_y \delta\big(x - g(y)\big) \, p_Y(y) \, \mathrm{d}y.</math>

=== Marginal probability density function ===
Given two '''continuous''' [[random variable]]s ''X'' and ''Y'' whose [[joint distribution]] is known, the marginal [[probability density function]] of either variable can be obtained by integrating the [[joint probability]] density, {{mvar|f}}, over ''Y'', and vice versa. That is,
:<math>f_X(x) = \int_{c}^{d} f(x,y) \, dy</math>
:<math>f_Y(y) = \int_{a}^{b} f(x,y) \, dx</math>
where <math>x\in[a,b]</math> and <math>y\in[c,d]</math>.

=== Marginal cumulative distribution function ===
Finding the marginal [[cumulative distribution function]] from the joint cumulative distribution function is easy. Recall that:
* For '''discrete [[random variable]]s''', <math display="block">F(x,y) = P(X\leq x, Y\leq y)</math>
* For '''continuous random variables''', <math display="block">F(x,y) = \int_{a}^{x} \int_{c}^{y} f(x',y') \, dy' \, dx'</math>

If ''X'' and ''Y'' jointly take values on [''a'', ''b''] × [''c'', ''d''], then
:<math>F_X(x)=F(x,d)</math> and <math>F_Y(y)=F(b,y)</math>
If ''d'' is ∞, then this becomes a limit <math display="inline">F_X(x) = \lim_{y \to \infty} F(x,y)</math>. Likewise for <math>F_Y(y)</math>.
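As a quick numerical check, the marginal distributions shown along the margins of the table above can be recomputed from the joint values by summing rows and columns. A minimal Python sketch (variable names are illustrative; the standard-library <code>fractions</code> module keeps the arithmetic exact):

```python
from fractions import Fraction as F

# Joint pmf p(x_i, y_j) from the table above:
# rows correspond to y_1..y_3, columns to x_1..x_4.
joint = [
    [F(4, 32), F(2, 32), F(1, 32), F(1, 32)],  # y_1
    [F(3, 32), F(6, 32), F(3, 32), F(3, 32)],  # y_2
    [F(9, 32), F(0, 32), F(0, 32), F(0, 32)],  # y_3
]

# p_Y(y_j) = sum over i of p(x_i, y_j): sum each row over all values of X.
p_Y = [sum(row) for row in joint]

# p_X(x_i) = sum over j of p(x_i, y_j): sum each column over all values of Y.
p_X = [sum(col) for col in zip(*joint)]

print([str(p) for p in p_X])  # ['1/2', '1/4', '1/8', '1/8'], i.e. 16/32, 8/32, 4/32, 4/32
print([str(p) for p in p_Y])  # ['1/4', '15/32', '9/32'], i.e. 8/32, 15/32, 9/32
print(sum(p_X) == 1 == sum(p_Y))  # True: each marginal sums to 1
```

The results match the bottom margin ''p<sub>X</sub>''(''x'') and the right margin ''p<sub>Y</sub>''(''y'') of the table.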