In [[probability theory]], the '''multinomial distribution''' is a generalization of the [[binomial distribution]]. For example, it models the probability of counts for each side of a ''k''-sided die rolled ''n'' times. For ''n'' [[statistical independence|independent]] trials, each of which leads to a success for exactly one of ''k'' categories, with each category having a given fixed success probability, the multinomial distribution gives the probability of any particular combination of numbers of successes for the various categories.

When ''k'' is 2 and ''n'' is 1, the multinomial distribution is the [[Bernoulli distribution]]. When ''k'' is 2 and ''n'' is bigger than 1, it is the [[binomial distribution]]. When ''k'' is bigger than 2 and ''n'' is 1, it is the [[categorical distribution]]. The term "multinoulli" is sometimes used for the categorical distribution to emphasize this four-way relationship (so ''n'' determines the suffix, and ''k'' the prefix).

The [[Bernoulli distribution]] models the outcome of a single [[Bernoulli trial]]. In other words, it models whether flipping a (possibly [[Fair coin|biased]]) coin one time will result in either a success (obtaining a head) or failure (obtaining a tail). The [[binomial distribution]] generalizes this to the number of heads from performing ''n'' independent flips (Bernoulli trials) of the same coin. The multinomial distribution models the outcome of ''n'' experiments, where the outcome of each trial has a [[categorical distribution]], such as rolling a (possibly [[Loaded dice|biased]]) ''k''-sided die ''n'' times.

Let ''k'' be a fixed finite number. Mathematically, we have ''k'' possible mutually exclusive outcomes, with corresponding probabilities ''p''<sub>1</sub>, ..., ''p''<sub>''k''</sub>, and ''n'' independent trials. Since the ''k'' outcomes are mutually exclusive and one must occur, we have ''p''<sub>''i''</sub> ≥ 0 for ''i'' = 1, ..., ''k'' and <math>\sum_{i=1}^k p_i = 1</math>.
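The probability of a particular combination of counts is given by the probability mass function <math>\frac{n!}{x_1!\cdots x_k!} p_1^{x_1} \cdots p_k^{x_k}</math> (see the infobox). As an illustrative sketch (the function name `multinomial_pmf` is our own, not from any library), it can be computed directly with the Python standard library:

```python
from math import factorial, prod

def multinomial_pmf(x, p):
    """P(X1 = x1, ..., Xk = xk) = n!/(x1! ... xk!) * p1^x1 * ... * pk^xk."""
    n = sum(x)
    coeff = factorial(n)            # multinomial coefficient n!/(x1! ... xk!)
    for xi in x:
        coeff //= factorial(xi)
    return coeff * prod(pi ** xi for pi, xi in zip(p, x))

# A fair six-sided die rolled 6 times, each face appearing exactly once:
# probability is 6!/6^6 = 720/46656, about 0.0154.
print(multinomial_pmf([1, 1, 1, 1, 1, 1], [1/6] * 6))
```

The same routine covers the special cases mentioned above: with ''k'' = 2 it reduces to the binomial pmf, and with ''n'' = 1 each outcome's probability is simply ''p''<sub>''i''</sub>.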
Then if the random variables ''X''<sub>''i''</sub> indicate the number of times outcome number ''i'' is observed over the ''n'' trials, the vector ''X'' = (''X''<sub>1</sub>, ..., ''X''<sub>''k''</sub>) follows a multinomial distribution with parameters ''n'' and '''p''', where '''p''' = (''p''<sub>1</sub>, ..., ''p''<sub>''k''</sub>). While the trials are independent, their outcomes ''X''<sub>''i''</sub> are dependent because they must sum to ''n''.

{{short description|Generalization of the binomial distribution}}
{{Probability distribution
 |pdf_image =
 |cdf_image =
 |name = Multinomial distribution
 |type = mass
 |parameters = <math>n \in \{0, 1, 2, \ldots\}</math> number of trials<br /> <math>k > 0</math> number of mutually exclusive events (integer)<br /> <math>p_1, \ldots, p_k</math> event probabilities, where <math>p_1 + \dots + p_k = 1</math>
 |support = <math>\left\lbrace (x_1, \dots, x_k) \, \Big\vert\, \sum_{i=1}^k x_i = n, x_i \ge 0\ (i=1,\dots,k) \right\rbrace</math>
 |pdf = <math>\frac{n!}{x_1!\cdots x_k!} p_1^{x_1} \cdots p_k^{x_k}</math>
 |cdf =
 |mean = <math>\operatorname E(X_i) = np_i</math>
 |median =
 |mode =
 |variance = <math>\operatorname{Var}(X_i) = n p_i (1-p_i)</math><br /><math>\operatorname{Cov}(X_i,X_j) = - n p_i p_j~~(i\neq j)</math>
 |skewness =
 |kurtosis =
 |mgf = <math>\biggl( \sum_{i=1}^k p_i e^{t_i} \biggr)^n</math>
 |char = <math>\left(\sum_{j=1}^k p_j e^{it_j}\right)^n</math> where <math>i^2 = -1</math>
 |pgf = <math>\biggl( \sum_{i=1}^k p_i z_i \biggr)^n</math> for <math>(z_1,\ldots,z_k)\in\mathbb{C}^k</math>
 <!-- invalid parameter |conjugate = [[Dirichlet distribution|Dirichlet]]: <math>\mathrm{Dir}(\alpha+\beta)</math> -->
 |entropy = <math>-\log(n!) - n\sum_{i=1}^k p_i \log(p_i) + \sum_{i=1}^k \sum_{x_i=0}^n \binom{n}{x_i} p_i^{x_i} (1-p_i)^{n-x_i} \log(x_i!)</math>
}}
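The moment formulas in the infobox, <math>\operatorname E(X_i) = np_i</math>, <math>\operatorname{Var}(X_i) = n p_i (1-p_i)</math>, and <math>\operatorname{Cov}(X_i,X_j) = -n p_i p_j</math>, can be checked by brute-force enumeration for small ''n'' and ''k''. The sketch below (our own illustration; the chosen values of `n` and `p` are arbitrary) enumerates all <math>k^n</math> outcome sequences of the ''n'' independent trials, tallies the counts, and computes the moments exactly:

```python
from itertools import product
from math import prod

# Hypothetical small example: n = 4 trials, k = 3 categories.
p = [0.5, 0.3, 0.2]
n, k = 4, len(p)

E = [0.0] * k    # E[X_i]
EX2 = [0.0] * k  # E[X_i^2]
EXY = 0.0        # E[X_0 * X_1]

# Sum over every possible sequence of trial outcomes, weighted by its probability.
for seq in product(range(k), repeat=n):
    prob = prod(p[i] for i in seq)
    x = [seq.count(i) for i in range(k)]  # counts X_1, ..., X_k for this sequence
    for i in range(k):
        E[i] += prob * x[i]
        EX2[i] += prob * x[i] ** 2
    EXY += prob * x[0] * x[1]

var0 = EX2[0] - E[0] ** 2   # should equal n * p[0] * (1 - p[0]) = 1.0
cov01 = EXY - E[0] * E[1]   # should equal -n * p[0] * p[1] = -0.6
print(E[0], var0, cov01)    # E[X_0] should equal n * p[0] = 2.0
```

The negative covariance reflects the dependence noted above: since the counts must sum to ''n'', a larger ''X''<sub>''i''</sub> forces the other counts to be smaller.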