{{Short description|Continuous probability distribution}}
{{About|the central F-distribution|the generalized distribution|noncentral F-distribution|other uses|F-ratio (disambiguation){{!}}F-ratio}}
{{distinguish|text=[[F-statistics|''F''-statistics]] as used in population genetics}}
{{DISPLAYTITLE:''F''-distribution}}
{{Probability distribution
| name = Fisher–Snedecor
| type = density
| pdf_image = [[Image:F-distribution pdf.svg|325px]]
| cdf_image = [[Image:F_dist_cdf.svg|325px]]
| parameters = ''d''<sub>1</sub>, ''d''<sub>2</sub> > 0 deg. of freedom
| support = <math>x \in (0, +\infty)\;</math> if <math>d_1 = 1</math>, otherwise <math>x \in [0, +\infty)\;</math>
| pdf = <math>\frac{\sqrt{\frac{(d_1 x)^{d_1} d_2^{d_2}}{(d_1 x+d_2)^{d_1+d_2}}}}{x\,\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)}\!</math>
| cdf = <math>I_{\frac{d_1 x}{d_1 x + d_2}} \left(\tfrac{d_1}{2}, \tfrac{d_2}{2} \right)</math>
| mean = <math>\frac{d_2}{d_2-2}\!</math><br /> for ''d''<sub>2</sub> > 2
| median =
| mode = <math>\frac{d_1-2}{d_1}\;\frac{d_2}{d_2+2}</math><br /> for ''d''<sub>1</sub> > 2
| variance = <math>\frac{2\,d_2^2\,(d_1+d_2-2)}{d_1 (d_2-2)^2 (d_2-4)}\!</math><br /> for ''d''<sub>2</sub> > 4
| skewness = <math>\frac{(2 d_1 + d_2 - 2) \sqrt{8 (d_2-4)}}{(d_2-6) \sqrt{d_1 (d_1 + d_2 -2)}}\!</math><br /> for ''d''<sub>2</sub> > 6
| kurtosis = ''see text''
| entropy = <math>\begin{align} & \ln \Gamma{\left(\tfrac{d_1}{2} \right)} + \ln \Gamma{\left(\tfrac{d_2}{2} \right)} - \ln \Gamma{\left(\tfrac{d_1+d_2}{2} \right)} \\ &+ \left(1-\tfrac{d_1}{2} \right) \psi{\left(\tfrac{d_1}{2} \right)} - \left(1+\tfrac{d_2}{2} \right) \psi{\left(\tfrac{d_2}{2} \right)} \\ &+ \left(\tfrac{d_1 + d_2}{2} \right) \psi{\left(\tfrac{d_1 + d_2}{2} \right)} + \ln \frac{d_2}{d_1} \end{align}</math><ref name=lazo1978entropy>{{Cite journal |last1=Lazo |first1=A.V. |last2=Rathie |first2=P. |title=On the entropy of continuous probability distributions |journal=IEEE Transactions on Information Theory |volume=24 |number=1 |pages=120–122 |year=1978 |publisher=IEEE |doi=10.1109/tit.1978.1055832}}</ref>
| mgf = ''does not exist, raw moments defined in text and in <ref name=johnson /><ref name=abramowitz />''
| char = ''see text''
}}

In [[probability theory]] and [[statistics]], the '''''F''-distribution''' or '''''F''-ratio''', also known as '''Snedecor's ''F'' distribution''' or the '''Fisher–Snedecor distribution''' (after [[Ronald Fisher]] and [[George W. Snedecor]]), is a [[continuous probability distribution]] that arises frequently as the [[null distribution]] of a [[test statistic]], most notably in the [[analysis of variance]] (ANOVA) and other [[F-test|''F''-tests]].<ref name=johnson>{{cite book | last = Johnson | first = Norman Lloyd | author2 = Samuel Kotz | author3 = N. Balakrishnan | title = Continuous Univariate Distributions, Volume 2 (Section 27) | edition = 2nd | publisher = Wiley | year = 1995 | isbn = 0-471-58494-0}}</ref><ref name=abramowitz>{{Abramowitz_Stegun_ref|26|946}}</ref><ref>NIST (2006). [http://www.itl.nist.gov/div898/handbook/eda/section3/eda3665.htm Engineering Statistics Handbook – F Distribution]</ref><ref>{{cite book | last = Mood | first = Alexander | author2 = Franklin A. Graybill | author3 = Duane C. Boes | title = Introduction to the Theory of Statistics | edition = Third | pages = 246–249 | publisher = McGraw-Hill | year = 1974 | isbn = 0-07-042864-6}}</ref>

==Definitions==
The ''F''-distribution with ''d''<sub>1</sub> and ''d''<sub>2</sub> degrees of freedom is the distribution of
<math display="block"> X = \frac{U_1/d_1}{U_2/d_2} </math>
where <math display=inline>U_1</math> and <math display=inline>U_2</math> are [[Independence (probability theory)|independent]] [[random variable]]s with [[chi-square distribution]]s with respective degrees of freedom <math display=inline>d_1</math> and <math display=inline>d_2</math>.

It follows that the [[probability density function]] (pdf) for ''X'' is given by
<math display="block">\begin{align} f(x; d_1,d_2) &= \frac{\sqrt{\frac{(d_1x)^{d_1}\,\,d_2^{d_2}} {(d_1x+d_2)^{d_1+d_2}}}} {x\operatorname{B}\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \\[5pt] &=\frac{1}{\operatorname{B}\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \left(\frac{d_1}{d_2}\right)^{\frac{d_1}{2}} x^{\frac{d_1}{2} - 1} \left(1+\frac{d_1}{d_2} \, x \right)^{-\frac{d_1+d_2}{2}} \end{align}</math>
for [[real number|real]] ''x'' > 0. Here <math>\mathrm{B}</math> is the [[beta function]]. In many applications, the parameters ''d''<sub>1</sub> and ''d''<sub>2</sub> are [[positive integer]]s, but the distribution is well-defined for positive real values of these parameters.

The [[cumulative distribution function]] is
<math display="block">F(x; d_1,d_2)=I_{d_1 x/(d_1 x + d_2)}\left (\tfrac{d_1}{2}, \tfrac{d_2}{2} \right) ,</math>
where ''I'' is the [[regularized incomplete beta function]].

==Properties==
The expectation, variance, and other details of the F(''d''<sub>1</sub>, ''d''<sub>2</sub>) distribution are given in the sidebox; for ''d''<sub>2</sub> > 8, the [[excess kurtosis]] is
<math display="block">\gamma_2 = 12\frac{d_1(5d_2-22)(d_1+d_2-2)+(d_2-4)(d_2-2)^2}{d_1(d_2-6)(d_2-8)(d_1+d_2-2)}.</math>

The ''k''-th moment of an F(''d''<sub>1</sub>, ''d''<sub>2</sub>) distribution exists and is finite only when 2''k'' < ''d''<sub>2</sub>, and it is equal to<ref name=taboga>{{cite web | last1 = Taboga | first1 = Marco | url = http://www.statlect.com/F_distribution.htm | title = The F distribution}}</ref>
<math display="block">\mu _X(k) =\left( \frac{d_2}{d_1}\right)^k \frac{\Gamma \left(\tfrac{d_1}{2}+k\right) }{\Gamma \left(\tfrac{d_1}{2}\right)} \frac{\Gamma \left(\tfrac{d_2}{2}-k\right) }{\Gamma \left( \tfrac{d_2}{2}\right) }.</math>

The ''F''-distribution is a particular [[Parametrization (geometry)|parametrization]] of the [[beta prime distribution]], which is also called the beta distribution of the second kind.

The [[Characteristic function (probability theory)|characteristic function]] is listed incorrectly in many standard references (e.g.,<ref name=abramowitz />). The correct expression<ref>Phillips, P. C. B. (1982) "The true characteristic function of the F distribution," ''[[Biometrika]]'', 69: 261–264 {{JSTOR|2335882}}</ref> is
<math display="block">\varphi^F_{d_1, d_2}(s) = \frac{\Gamma{\left(\frac{d_1+d_2}{2}\right)}}{\Gamma{\left(\tfrac{d_2}{2}\right)}} U \! \left(\frac{d_1}{2},1-\frac{d_2}{2},-\frac{d_2}{d_1} \imath s \right)</math>
where ''U''(''a'', ''b'', ''z'') is the [[confluent hypergeometric function]] of the second kind.
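The definition and the moment formulas above lend themselves to a direct numerical check. The following is a minimal illustrative sketch only, assuming the Python libraries NumPy and SciPy are available; the degrees of freedom and the evaluation point are arbitrary choices. It simulates the ratio of scaled chi-squared variates, compares the empirical mean and variance with the closed-form expressions, and checks the cumulative distribution function against the regularized incomplete beta function.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats
from scipy.special import betainc

# Illustrative degrees of freedom (any positive values would do).
d1, d2 = 5, 12
rng = np.random.default_rng(0)

# X = (U1/d1) / (U2/d2), with U1 ~ chi^2(d1) and U2 ~ chi^2(d2) independent.
u1 = rng.chisquare(d1, size=1_000_000)
u2 = rng.chisquare(d2, size=1_000_000)
x = (u1 / d1) / (u2 / d2)

# Empirical moments versus the closed-form mean d2/(d2-2) (valid for d2 > 2)
# and variance 2 d2^2 (d1+d2-2) / (d1 (d2-2)^2 (d2-4)) (valid for d2 > 4).
print(x.mean(), d2 / (d2 - 2))
print(x.var(), 2 * d2**2 * (d1 + d2 - 2) / (d1 * (d2 - 2) ** 2 * (d2 - 4)))

# The cdf F(x; d1, d2) equals the regularized incomplete beta function
# I_{d1 x / (d1 x + d2)}(d1/2, d2/2); scipy.stats.f implements the same law.
x0 = 1.7
print(stats.f.cdf(x0, d1, d2), betainc(d1 / 2, d2 / 2, d1 * x0 / (d1 * x0 + d2)))
</syntaxhighlight>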
==Related distributions==

===Relation to the chi-squared distribution===
In instances where the ''F''-distribution is used, for example in the [[analysis of variance]], independence of <math>U_1</math> and <math>U_2</math> (defined above) might be demonstrated by applying [[Cochran's theorem]].

Equivalently, since the [[chi-squared distribution]] is the sum of squares of [[Independence (probability theory)|independent]] [[standard normal]] random variables, the random variable of the ''F''-distribution may also be written
<math display="block">X = \frac{s_1^2}{\sigma_1^2} \div \frac{s_2^2}{\sigma_2^2},</math>
where <math>s_1^2 = \frac{S_1^2}{d_1}</math> and <math>s_2^2 = \frac{S_2^2}{d_2}</math>, <math>S_1^2</math> is the sum of squares of <math>d_1</math> random variables from the normal distribution <math>N(0,\sigma_1^2)</math> and <math>S_2^2</math> is the sum of squares of <math>d_2</math> random variables from the normal distribution <math>N(0,\sigma_2^2)</math>.

In a [[frequentist]] context, a scaled ''F''-distribution therefore gives the probability <math>p(s_1^2/s_2^2 \mid \sigma_1^2, \sigma_2^2)</math>, with the ''F''-distribution itself, without any scaling, applying where <math>\sigma_1^2</math> is taken equal to <math>\sigma_2^2</math>. This is the context in which the ''F''-distribution most generally appears in [[F-test|''F''-tests]]: the null hypothesis is that two independent normal variances are equal, and the observed sums of some appropriately selected squares are then examined to see whether their ratio is significantly incompatible with this null hypothesis.

The quantity <math>X</math> has the same distribution in Bayesian statistics, if an uninformative rescaling-invariant [[Jeffreys prior]] is taken for the [[prior probability|prior probabilities]] of <math>\sigma_1^2</math> and <math>\sigma_2^2</math>.<ref>{{cite book |first=G. E. P. |last=Box |first2=G. C. |last2=Tiao |year=1973 |title=Bayesian Inference in Statistical Analysis |publisher=Addison-Wesley |page=110 |isbn=0-201-00622-7 }}</ref> In this context, a scaled ''F''-distribution thus gives the posterior probability <math>p(\sigma^2_2 /\sigma_1^2 \mid s^2_1, s^2_2)</math>, where the observed sums <math>s^2_1</math> and <math>s^2_2</math> are now taken as known.

===In general===
*If <math>X \sim \chi^2_{d_1}</math> and <math>Y \sim \chi^2_{d_2}</math> ([[Chi squared distribution]]) are [[independence (probability theory)|independent]], then <math> \frac{X / d_1}{Y / d_2} \sim \mathrm{F}(d_1, d_2)</math>
*If <math>X_k \sim \Gamma(\alpha_k,\beta_k)\,</math> ([[Gamma distribution]]) are independent, then <math> \frac{\alpha_2\beta_1 X_1}{\alpha_1\beta_2 X_2} \sim \mathrm{F}(2\alpha_1, 2\alpha_2)</math>
*If <math>X \sim \operatorname{Beta}(d_1/2,d_2/2)</math> ([[Beta distribution]]) then <math>\frac{d_2 X}{d_1(1-X)} \sim \operatorname{F}(d_1,d_2)</math>
*Equivalently, if <math>X \sim F(d_1, d_2)</math>, then <math>\frac{d_1 X/d_2}{1+d_1 X/d_2} \sim \operatorname{Beta}(d_1/2,d_2/2)</math>.
*If <math>X \sim F(d_1, d_2)</math>, then <math>\frac{d_1}{d_2}X</math> has a [[beta prime distribution]]: <math>\frac{d_1}{d_2}X \sim \operatorname{\beta^\prime}\left(\tfrac{d_1}{2},\tfrac{d_2}{2}\right)</math>.
*If <math>X \sim F(d_1, d_2)</math> then <math>Y = \lim_{d_2 \to \infty} d_1 X</math> has the [[chi-squared distribution]] <math>\chi^2_{d_1}</math>.
*<math>F(d_1, d_2)</math> is equivalent to the scaled [[Hotelling's T-squared distribution]] <math>\frac{d_2}{d_1(d_1+d_2-1)} \operatorname{T}^2 (d_1, d_1 +d_2-1) </math>.
*If <math>X \sim F(d_1, d_2)</math> then <math>X^{-1} \sim F(d_2, d_1)</math>.
*If <math>X\sim t_{(n)}</math> ([[Student's t-distribution]]), then <math display="block">\begin{align} X^{2} &\sim \operatorname{F}(1, n) \\ X^{-2} &\sim \operatorname{F}(n, 1) \end{align}</math>
*The ''F''-distribution is a special case of the type 6 [[Pearson distribution]].
*If <math>X</math> and <math>Y</math> are independent, with <math>X,Y\sim</math> [[Laplace distribution|Laplace(''μ'', ''b'')]], then <math display="block"> \frac{|X-\mu|}{|Y-\mu|} \sim \operatorname{F}(2,2) </math>
*If <math>X\sim F(n,m)</math> then <math>\tfrac{\log{X}}{2} \sim \operatorname{FisherZ}(n,m)</math> ([[Fisher's z-distribution]]).
*The [[noncentral F-distribution|noncentral ''F''-distribution]] simplifies to the ''F''-distribution if <math>\lambda=0</math>.
*The doubly [[noncentral F-distribution|noncentral ''F''-distribution]] simplifies to the ''F''-distribution if <math> \lambda_1 = \lambda_2 = 0 </math>.
*If <math>\operatorname{Q}_X(p)</math> is the quantile ''p'' for <math>X\sim F(d_1,d_2)</math> and <math>\operatorname{Q}_Y(1-p)</math> is the quantile <math>1-p</math> for <math>Y\sim F(d_2,d_1)</math>, then <math display="block">\operatorname{Q}_X(p)=\frac{1}{\operatorname{Q}_Y(1-p)}.</math>
*The ''F''-distribution is an instance of [[ratio distributions]].
*The [[Kendall's W|W]]-distribution<ref>{{Cite journal |last1=Mahmoudi |first1=Amin |last2=Javed |first2=Saad Ahmed |date=October 2022 |title=Probabilistic Approach to Multi-Stage Supplier Evaluation: Confidence Level Measurement in Ordinal Priority Approach |journal=Group Decision and Negotiation |language=en |volume=31 |issue=5 |pages=1051–1096 |doi=10.1007/s10726-022-09790-1 |issn=0926-2644 |pmc=9409630 |pmid=36042813}}</ref> is a unique parametrization of the ''F''-distribution.

==See also==
{{Colbegin}}
*[[Beta prime distribution]]
*[[Chi-square distribution]]
*[[Chow test]]
*[[Gamma distribution]]
*[[Hotelling's T-squared distribution]]
*[[Wilks' lambda distribution]]
*[[Wishart distribution]]
*[[Modified half-normal distribution]]<ref name="Sun, Kong and Pal">{{cite journal |last1=Sun |first1=Jingchao |last2=Kong |first2=Maiying |last3=Pal |first3=Subhadip |title=The Modified-Half-Normal distribution: Properties and an efficient sampling scheme |journal=Communications in Statistics - Theory and Methods |date=22 June 2021 |volume=52 |issue=5 |pages=1591–1613 |doi=10.1080/03610926.2021.1934700 |s2cid=237919587 |url=https://figshare.com/articles/journal_contribution/The_Modified-Half-Normal_distribution_Properties_and_an_efficient_sampling_scheme/14825266/1/files/28535884.pdf |issn=0361-0926}}</ref> with the pdf on <math>(0, \infty)</math> given as <math> f(x)= \frac{2\beta^{\frac{\alpha}{2}} x^{\alpha-1} \exp(-\beta x^2+ \gamma x )}{\Psi{\left(\frac{\alpha}{2}, \frac{ \gamma}{\sqrt{\beta}}\right)}}</math>, where <math>\Psi(\alpha,z)={}_1\Psi_1\left(\begin{matrix}\left(\alpha,\frac{1}{2}\right)\\(1,0)\end{matrix};z \right)</math> denotes the [[Fox–Wright Psi function]].
{{Colend}}

==References==
{{reflist}}

==External links==
*[http://www.itl.nist.gov/div898/handbook/eda/section3/eda3673.htm Table of critical values of the ''F''-distribution]
*[https://mathshistory.st-andrews.ac.uk/Miller/mathword/f/ Earliest Uses of Some of the Words of Mathematics: entry on ''F''-distribution contains a brief history]
*[http://www.waterlog.info/f-test.htm Free calculator for ''F''-testing]

{{ProbDistributions|continuous-semi-infinite}}

{{DEFAULTSORT:F-distribution}}
[[Category:Continuous distributions]]
[[Category:Analysis of variance]]