{{short description|Continuous probability distribution}}
{{Probability distribution
| name       = Logistic distribution
| type       = density
| pdf_image  = [[File:Logisticpdfunction.svg|320px|Standard logistic PDF]]
| cdf_image  = [[File:Logistic cdf.svg|320px|Standard logistic CDF]]
| parameters = <math>\mu,</math> [[location parameter|location]] ([[real number|real]])<br /><math>s > 0,</math> [[scale parameter|scale]] (real)
| support    = <math>x \in (-\infty, \infty)</math>
| pdf        = <math>\frac{e^{-(x-\mu)/s}} {s\left(1+e^{-(x-\mu)/s}\right)^2}</math>
| cdf        = <math>\frac{1}{1+e^{-(x-\mu)/s}} = \frac{1 + \tanh \frac{x-\mu}{2s}}{2}</math>
| quantile   = <math>\mu+s \log\left(\frac{p}{1-p}\right)</math>
| mean       = <math>\mu</math>
| median     = <math>\mu</math>
| mode       = <math>\mu</math>
| variance   = <math>\frac{s^2 \pi^2}{3}</math>
| skewness   = <math>0</math>
| kurtosis   = <math>6/5</math>
| entropy    = <math>\ln s + 2</math>
| mgf        = <math>e^{\mu t}\Beta(1-st, 1+st)</math><br />for <math>t \in (-1/s,1/s)</math><br />and <math>\Beta</math> is the [[Beta function]]
| char       = <math>e^{it\mu}\frac{\pi st}{\sinh(\pi st)}</math>
| ES         = <math>\mu + \frac{sH(p)}{1-p}</math><br />where <math>H(p)</math> is the binary entropy function<ref name="norton">{{cite journal |last1=Norton |first1=Matthew |last2=Khokhlov |first2=Valentyn |last3=Uryasev |first3=Stan |year=2019 |title=Calculating CVaR and bPOE for common probability distributions with application to portfolio optimization and density estimation |journal=Annals of Operations Research |volume=299 |issue=1–2 |pages=1281–1315 |publisher=Springer |doi=10.1007/s10479-019-03373-1 |arxiv=1811.11301 |url=http://uryasev.ams.stonybrook.edu/wp-content/uploads/2019/10/Norton2019_CVaR_bPOE.pdf |access-date=2023-02-27 |url-status=dead |archive-url=https://web.archive.org/web/20230301065519/http://uryasev.ams.stonybrook.edu/wp-content/uploads/2019/10/Norton2019_CVaR_bPOE.pdf |archive-date=Mar 1, 2023}}</ref> <math>H(p) = -p \ln(p) - (1-p) \ln (1-p)</math>
}}

In [[probability theory]] and [[statistics]], the '''logistic distribution''' is a [[continuous probability distribution]]. Its [[cumulative distribution function]] is the [[logistic function]], which appears in [[logistic regression]] and [[feedforward neural network]]s. It resembles the [[normal distribution]] in shape but has heavier tails (higher [[kurtosis]]). The logistic distribution is a special case of the [[Tukey lambda distribution]].

== Specification ==

=== Cumulative distribution function ===

The logistic distribution receives its name from its [[cumulative distribution function]], which is an instance of the family of logistic functions. The cumulative distribution function of the logistic distribution is also a scaled version of the [[Hyperbolic function|hyperbolic tangent]]:

:<math>F(x; \mu, s) = \frac{1}{1+e^{-(x-\mu)/s}} = \frac12 + \frac12 \operatorname{tanh} \left(\frac{x-\mu}{2s}\right).</math>

In this equation {{math|''μ''}} is the [[mean]], and {{math|''s''}} is a scale parameter proportional to the [[standard deviation]].

=== Probability density function ===

The [[probability density function]] is the [[partial derivative]] of the cumulative distribution function:

: <math>
\begin{align}
f(x; \mu,s) & = \frac{\partial F(x; \mu, s)}{\partial x} = \frac{e^{-(x-\mu)/s}} {s\left(1+e^{-(x-\mu)/s}\right)^2} \\[4pt]
& = \frac{1}{s\left(e^{(x-\mu)/(2s)}+e^{-(x-\mu)/(2s)}\right)^2} \\[4pt]
& = \frac{1}{4s} \operatorname{sech}^2\left(\frac{x-\mu}{2s}\right).
\end{align}
</math>

When the location parameter {{math|''μ''}} is 0 and the scale parameter {{math|''s''}} is 1, the [[probability density function]] of the logistic distribution is given by

: <math>
\begin{align}
f(x; 0,1) & = \frac{e^{-x}}{(1+e^{-x})^2} \\[4pt]
& = \frac 1 {(e^{x/2} + e^{-x/2})^2} \\[5pt]
& = \frac 1 4 \operatorname{sech}^2 \left(\frac x 2 \right).
\end{align}
</math>

Because this function can be expressed in terms of the square of the [[hyperbolic function|hyperbolic secant function]] "sech", it is sometimes referred to as the ''sech-square(d) distribution''.<ref>Johnson, Kotz & Balakrishnan (1995, p. 116).</ref> (See also: [[hyperbolic secant distribution]].)

=== Quantile function ===

The [[inverse function|inverse]] cumulative distribution function ([[quantile function]]) of the logistic distribution is a generalization of the [[logit]] function. Its derivative is called the quantile density function. They are defined as follows:

:<math>Q(p;\mu,s) = \mu + s \ln\left(\frac{p}{1-p}\right).</math>

:<math>Q'(p;s) = \frac{s}{p(1-p)}.</math>
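The closed-form density, distribution, and quantile functions above translate directly into code. The following Python sketch is purely illustrative (it is not drawn from the cited sources, and the function names are arbitrary); it evaluates the three functions and draws samples by inverse-transform sampling, i.e. by pushing uniform variates through <math>Q</math>.

<syntaxhighlight lang="python">
# Minimal sketch: density, CDF, and quantile of Logistic(mu, s),
# plus inverse-transform sampling, using only the Python standard library.
import math
import random

def logistic_pdf(x, mu=0.0, s=1.0):
    """f(x; mu, s) = sech^2((x - mu) / (2 s)) / (4 s)."""
    return 1.0 / (4.0 * s * math.cosh((x - mu) / (2.0 * s)) ** 2)

def logistic_cdf(x, mu=0.0, s=1.0):
    """F(x; mu, s) = 1 / (1 + exp(-(x - mu) / s)), written via tanh."""
    return 0.5 + 0.5 * math.tanh((x - mu) / (2.0 * s))

def logistic_quantile(p, mu=0.0, s=1.0):
    """Q(p; mu, s) = mu + s * logit(p), for 0 < p < 1."""
    return mu + s * math.log(p / (1.0 - p))

def logistic_sample(mu=0.0, s=1.0, rng=random):
    """Inverse-transform sampling: push U(0, 1) through the quantile function."""
    return logistic_quantile(rng.random(), mu, s)

# Sanity check: the sample variance should be close to s^2 * pi^2 / 3.
mu, s = 2.0, 1.5
xs = [logistic_sample(mu, s) for _ in range(100_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(mean, var, s**2 * math.pi**2 / 3)
</syntaxhighlight>

Writing the CDF through <math>\tanh</math>, as in the identity above, avoids overflow of the exponential for arguments far below the location parameter.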
=== Alternative parameterization ===

An alternative parameterization of the logistic distribution can be derived by expressing the scale parameter, <math>s</math>, in terms of the standard deviation, <math>\sigma</math>, using the substitution <math>s\,=\,q\,\sigma</math>, where <math>q\,=\,\sqrt{3}/{\pi}\,=\,0.551328895\ldots</math>. Substituting <math>s = q\sigma</math> into the functions above yields the corresponding forms in terms of <math>\sigma</math> in a straightforward way.

== Applications ==

The logistic distribution, together with the S-shaped pattern of its [[cumulative distribution function]] (the [[logistic function]]) and its [[quantile function]] (the [[logit function]]), has been used extensively in many different areas.

=== Logistic regression ===

One of the most common applications is in [[logistic regression]], which is used for modeling [[categorical variable|categorical]] [[dependent variable]]s (e.g., yes–no choices or a choice of 3 or 4 possibilities), much as standard [[linear regression]] is used for modeling [[continuous variable]]s (e.g., income or population). Specifically, logistic regression models can be phrased as [[latent variable]] models with [[error variable]]s following a logistic distribution. This phrasing is common in the theory of [[discrete choice]] models, where the logistic distribution plays the same role in logistic regression as the [[normal distribution]] does in [[probit regression]]. Indeed, the logistic and normal distributions have a quite similar shape. However, the logistic distribution has [[heavy-tailed distribution|heavier tails]], which often increases the [[robust statistics|robustness]] of analyses based on it compared with using the normal distribution.
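The latent-variable phrasing can be made concrete with a small simulation. The following Python sketch is only an illustration under arbitrary, made-up coefficients (none of this comes from the sources cited here): it thresholds a latent utility, a linear predictor plus standard logistic noise, at zero, and the empirical frequency of a positive outcome matches the logistic function of the linear predictor.

<syntaxhighlight lang="python">
# Illustrative sketch of the latent-variable view of logistic regression:
# y = 1 exactly when the latent utility (linear predictor + logistic noise)
# exceeds zero, so P(y = 1 | x) = F(beta0 + beta1 * x), F being the logistic CDF.
import math
import random

def logistic_cdf(z):
    return 0.5 + 0.5 * math.tanh(z / 2.0)

def standard_logistic_noise(rng):
    u = rng.random()
    return math.log(u / (1.0 - u))  # standard logistic via inverse CDF

beta0, beta1 = -1.0, 2.0  # arbitrary "true" coefficients for the demo
rng = random.Random(0)

hits, trials, x = 0, 200_000, 0.8
for _ in range(trials):
    latent = beta0 + beta1 * x + standard_logistic_noise(rng)
    hits += latent > 0
print(hits / trials, logistic_cdf(beta0 + beta1 * x))  # the two should agree closely
</syntaxhighlight>

Because the logistic distribution is symmetric, thresholding the latent utility at zero gives exactly <math>F(\beta_0 + \beta_1 x)</math> for the probability of a positive outcome, which is the logistic regression model itself.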
=== Physics ===

The PDF of this distribution has the same functional form as the derivative of the [[Fermi function]]. In the theory of electron properties in semiconductors and metals, this derivative sets the relative weight of the various electron energies in their contributions to electron transport. Those energy levels whose energies are closest to the distribution's "mean" ([[Fermi level]]) dominate processes such as electronic conduction, with some smearing induced by temperature.<ref>{{Cite book |last1=Davies |first1=John H. |year=1998 |title=The Physics of Low-dimensional Semiconductors: An Introduction |publisher=Cambridge University Press |isbn=9780521484916}}</ref>{{rp|34}} However, the pertinent ''probability'' distribution in [[Fermi–Dirac statistics]] is actually a simple [[Bernoulli distribution]], with the probability factor given by the Fermi function.

The logistic distribution arises as the limit distribution of a finite-velocity damped random motion described by a telegraph process in which the random times between consecutive velocity changes have independent exponential distributions with linearly increasing parameters.<ref>A. Di Crescenzo, B. Martinucci (2010) "A damped telegraph random process with logistic stationary distribution", ''[[Applied Probability Trust|J. Appl. Prob.]]'', vol. 47, pp. 84–96.</ref>

=== Hydrology ===

[[File:FitLogisticdistr.tif|thumb|250px|Fitted cumulative logistic distribution to October rainfalls using [[CumFreq]]; see also [[Distribution fitting]].]]

In [[hydrology]] the distribution of long-duration river discharge and rainfall (e.g., monthly and yearly totals, consisting of the sum of 30 or 360 daily values, respectively) is often thought to be almost normal according to the [[central limit theorem]].<ref>{{cite book |editor-last=Ritzema |editor-first=H.P. |title=Frequency and Regression Analysis |year=1994 |publisher=Chapter 6 in: Drainage Principles and Applications, Publication 16, International Institute for Land Reclamation and Improvement (ILRI), Wageningen, The Netherlands |pages=[https://archive.org/details/drainageprincipl0000unse/page/175 175–224] |url=https://archive.org/details/drainageprincipl0000unse/page/175 |isbn=90-70754-33-9}}</ref> The cumulative distribution function of the normal distribution, however, has no closed form and must be evaluated numerically. Because the logistic distribution has a closed-form cumulative distribution function and is similar in shape to the normal distribution, it can be used instead. The blue picture illustrates an example of fitting the logistic distribution to ranked October rainfalls, which are almost normally distributed, and it shows the 90% [[confidence belt]] based on the [[binomial distribution]]. The rainfall data are represented by [[plotting position]]s as part of the [[cumulative frequency analysis]].

=== Chess ratings ===

The [[United States Chess Federation]] and FIDE have switched their formulas for calculating chess ratings from the normal distribution to the logistic distribution; see the article on the [[Elo rating system]] (itself based on the normal distribution).

== Related distributions ==

* The logistic distribution mimics the [[sech distribution]]; they are different cases of the [[Champernowne distribution]].
* If <math>X \sim \mathrm{Logistic}(\mu, s)</math> then <math>kX + \ell \sim \mathrm{Logistic}(k\mu + \ell, |k|s)</math>.
* If <math>X \sim </math> [[Uniform distribution (continuous)|U(0, 1)]] then <math>\mu + s \cdot \text{logit}(X) \sim \mathrm{Logistic}(\mu, s)</math>, where <math>\text{logit}(X)=\log X-\log(1-X)</math> is the [[logit]] function.
* If <math>X \sim \mathrm{Gumbel}(\mu_X, \beta) </math> and <math> Y \sim \mathrm{Gumbel}(\mu_Y, \beta) </math> independently, then <math> X-Y \sim \mathrm{Logistic}(\mu_X-\mu_Y,\beta) \,</math> (see the simulation sketch after this list).
* If <math>X </math> and <math>Y \sim \mathrm{Gumbel}(\mu, \beta) </math> then <math>X+Y \nsim \mathrm{Logistic}(2 \mu,\beta) \,</math>; the sum is ''not'' logistic, since <math> E(X+Y) = 2\mu+2\beta\gamma \neq 2\mu = E\left(\mathrm{Logistic}(2 \mu,\beta) \right) </math>, where <math>\gamma</math> is the [[Euler–Mascheroni constant]].
* If ''X'' ~ Logistic(''μ'', ''s'') then exp(''X'') ~ [[log-logistic distribution|LogLogistic]]<math> \left( \alpha = e^\mu, \beta = \frac 1 s \right) </math>, and exp(''X'') + ''γ'' ~ [[shifted log-logistic distribution|shifted log-logistic]]<math> \left( \alpha = e^\mu, \beta = \frac 1 s, \gamma \right) </math>.
* If ''X'' ~ [[Exponential distribution|Exponential(1)]] then
::<math>\mu+s\log(e^X -1) \sim \operatorname{Logistic}(\mu,s). </math>
* If ''X'', ''Y'' ~ Exponential(λ) independently then
::<math>\mu+s\log\left(\frac X Y \right) \sim \operatorname{Logistic}(\mu,s).</math>
* The [[metalog distribution]] is a generalization of the logistic distribution, in which power series expansions in terms of <math>p</math> are substituted for the logistic parameters <math>\mu</math> and <math>s</math>. The resulting metalog quantile function is highly shape-flexible, has a simple closed form, and can be fit to data with linear least squares.
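Several of the relations above are easy to check numerically. The following Python sketch is an illustrative simulation under arbitrary parameter choices (it is not part of the cited material): it generates samples via the logit-of-uniform and Gumbel-difference constructions and compares their sample mean and variance with <math>\mu</math> and <math>s^2\pi^2/3</math>.

<syntaxhighlight lang="python">
# Simulation check of two relations: mu + s*logit(U) and the difference of two
# Gumbel variates with common scale should both be Logistic(mu, s).
import math
import random

rng = random.Random(42)
mu, s = 1.0, 0.7
n = 200_000

def summarize(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, v

# Relation 1: mu + s * logit(U) ~ Logistic(mu, s) for U ~ U(0, 1).
u_based = []
for _ in range(n):
    u = rng.random()
    u_based.append(mu + s * math.log(u / (1.0 - u)))

# Relation 2: X - Y ~ Logistic(mu_X - mu_Y, beta) for independent
# Gumbel(mu_X, beta) and Gumbel(mu_Y, beta); here mu_X - mu_Y = mu, beta = s.
def gumbel(loc, scale):
    return loc - scale * math.log(-math.log(rng.random()))

gumbel_based = [gumbel(mu, s) - gumbel(0.0, s) for _ in range(n)]

print(summarize(u_based))       # approx. (mu, s^2 * pi^2 / 3)
print(summarize(gumbel_based))  # approx. (mu, s^2 * pi^2 / 3)
print(mu, s**2 * math.pi**2 / 3)
</syntaxhighlight>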
== Derivations ==

=== Higher-order moments ===

The ''n''th-order central moment can be expressed in terms of the quantile function:

: <math>
\begin{align}
\operatorname{E}[(X-\mu)^n] & = \int_{-\infty}^\infty (x-\mu)^n \, dF(x) \\
& = \int_0^1\big(Q(p)-\mu\big)^n \, dp = s^n \int_0^1 \left[\ln\!\left(\frac p {1-p} \right)\right]^n \, dp.
\end{align}
</math>

This integral is well known<ref>{{OEIS2C|A001896}}</ref> and can be expressed in terms of [[Bernoulli number]]s:

: <math> \operatorname{E}[(X-\mu)^n] = s^n\pi^n(2^n-2)\cdot|B_n|.</math>

== See also ==

* [[Generalized logistic distribution]]
* [[Tukey lambda distribution]]
* [[Log-logistic distribution]]
* [[Half-logistic distribution]]
* [[Logistic regression]]
* [[Sigmoid function]]

== Notes ==
{{Reflist}}

== References ==
{{commons category}}
* {{Cite journal |author1=John S. deCani |author2=Robert A. Stine |name-list-style=amp |year=1986 |title=A note on deriving the information matrix for a logistic distribution |journal=The American Statistician |volume=40 |issue=3 |pages=220–222 |publisher=American Statistical Association |doi=10.2307/2684541 |jstor=2684541}}
* {{Cite book |last=Balakrishnan |first=N. |year=1992 |title=Handbook of the Logistic Distribution |publisher=Marcel Dekker, New York |isbn=0-8247-8587-8}}
* {{cite book |last1=Johnson |first1=N. L. |last2=Kotz |first2=S. |last3=Balakrishnan |first3=N. |year=1995 |title=Continuous Univariate Distributions |others=Vol. 2 |edition=2nd |isbn=0-471-58494-0}}
* Modis, Theodore (1992) ''Predictions: Society's Telltale Signature Reveals the Past and Forecasts the Future'', Simon & Schuster, New York. {{isbn|0-671-75917-5}}

{{ProbDistributions|continuous-infinite}}
{{Authority control}}

{{DEFAULTSORT:Logistic Distribution}}
[[Category:Continuous distributions]]
[[Category:Location-scale family probability distributions]]