=== Derivative ===
[[File:Logistic function derivatives.png|class=skin-invert-image|thumb|The logistic function and its first three derivatives]]
The standard logistic function has an easily calculated [[derivative]]. The derivative is known as the density of the [[logistic distribution]]:
<math display="block">f(x) = \frac{1}{1 + e^{-x}} = \frac{e^x}{1 + e^x},</math>
<math display="block">\begin{align}
\frac{d}{dx} f(x) &= \frac{e^x \cdot (1 + e^x) - e^x \cdot e^x}{{\left(1 + e^x\right)}^2} \\[1ex]
&= \frac{e^x}{{\left(1 + e^x\right)}^2} \\[1ex]
&= \left(\frac{e^x}{1 + e^x}\right) \left(\frac{1}{1 + e^x}\right) \\[1ex]
&= \left(\frac{e^x}{1 + e^x}\right) \left(1-\frac{e^x}{1 + e^x}\right) \\[1.2ex]
&= f(x)\left(1 - f(x)\right)
\end{align}</math>
from which all higher derivatives can be derived algebraically. For example, <math>f'' = (1 - 2f)(1 - f)f</math>.

The logistic distribution is a [[location–scale family]] whose parameters correspond to parameters of the logistic function. If {{tmath|1=L = 1}} is fixed, then the midpoint {{tmath|x_0}} is the location and the reciprocal of the steepness, {{tmath|1/k}}, is the scale.

=== Integral ===
Conversely, the [[antiderivative]] of the standard logistic function can be computed by the [[Integration by substitution|substitution]] <math>u = 1 + e^x</math>, since
<math display="block">f(x) = \frac{e^x}{1 + e^x} = \frac{u'}{u},</math>
so (dropping the [[constant of integration]])
<math display="block">\int \frac{e^x}{1 + e^x}\,dx = \int \frac{1}{u}\,du = \ln u = \ln (1 + e^x).</math>
In [[artificial neural network]]s, this is known as the ''[[softplus]]'' function and (with scaling) is a smooth approximation of the [[ramp function]], just as the logistic function (with scaling) is a smooth approximation of the [[Heaviside step function]].

=== Taylor series ===
The standard logistic function is [[Analytic function|analytic]] on the whole real line since <math>f : \mathbb{R} \to \mathbb{R}</math>, <math>f(x) = \frac{1}{1+e^{-x}} = h(g(x))</math>, where <math>g : \mathbb{R} \to \mathbb{R}</math>, <math>g(x) = 1 + e^{-x}</math> and <math>h : (0, \infty) \to (0, \infty)</math>, <math>h(x) = \frac{1}{x}</math> are analytic on their domains, and the composition of analytic functions is again analytic.

A formula for the ''n''th derivative of the standard logistic function is
<math display="block">\frac{d^n f}{dx^n} = \sum_{i=1}^n (-1)^{n+i} \frac{\left(\sum_{j=1}^n {\left(-1\right)}^{i+j} \binom{i}{j} j^n\right) e^{-ix}}{{\left(1+e^{-x}\right)}^{i+1}},</math>
therefore its [[Taylor series]] about a point <math>a</math> is obtained by evaluating these derivatives at <math>a</math>:
<math display="block">f(x) = f(a) + \sum_{n=1}^{\infty} \left( \sum_{i=1}^n (-1)^{n+i} \frac{\left(\sum_{j=1}^n {\left(-1\right)}^{i+j} \binom{i}{j} j^n\right) e^{-ia}}{{\left(1 + e^{-a}\right)}^{i+1}} \right) \frac{{\left(x-a\right)}^n}{n!}.</math>

=== Logistic differential equation ===
The standard logistic function is the unique solution of the simple first-order non-linear [[ordinary differential equation]]
<math display="block">\frac{d}{dx}f(x) = f(x)\big(1 - f(x)\big)</math>
with [[boundary condition]] <math>f(0) = 1/2</math>. This equation is the continuous version of the [[logistic map]].
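As an illustration, the following short Python sketch (the solver, grid, and tolerances are arbitrary illustrative choices, not part of the article) integrates this differential equation numerically from the boundary condition <math>f(0) = 1/2</math> and compares the result with the closed form <math>1/(1 + e^{-x})</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import solve_ivp

# Logistic ODE: df/dx = f * (1 - f), with boundary condition f(0) = 1/2.
def logistic_rhs(x, f):
    return f * (1.0 - f)

x_eval = np.linspace(0.0, 6.0, 61)
sol = solve_ivp(logistic_rhs, (0.0, 6.0), [0.5],
                t_eval=x_eval, rtol=1e-10, atol=1e-12)

# Closed-form standard logistic function for comparison.
closed_form = 1.0 / (1.0 + np.exp(-x_eval))

# The maximum difference should be tiny (limited by the solver tolerances).
print(np.max(np.abs(sol.y[0] - closed_form)))
</syntaxhighlight>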
The reciprocal of the logistic function, <math>1/f</math>, satisfies a simple first-order ''linear'' ordinary differential equation.<ref>{{cite journal |last1=Kocian |first1=Alexander |last2=Carmassi |first2=Giulia |last3=Cela |first3=Fatjon |last4=Incrocci |first4=Luca |last5=Milazzo |first5=Paolo |last6=Chessa |first6=Stefano |title=Bayesian Sigmoid-Type Time Series Forecasting with Missing Data for Greenhouse Crops |journal=Sensors |date=7 June 2020 |volume=20 |issue=11 |page=3246 |doi=10.3390/s20113246 |pmid=32517314 |pmc=7309099 |bibcode=2020Senso..20.3246K |doi-access=free }}</ref>

The qualitative behavior is easily understood in terms of the [[Phase line (mathematics)|phase line]]: the derivative is 0 when the function is 0 or 1, positive for <math>f</math> between 0 and 1, and negative for <math>f</math> above 1 or below 0 (though negative populations do not generally accord with a physical model). This yields an unstable equilibrium at 0 and a stable equilibrium at 1, so for any initial value strictly between 0 and 1 the function grows toward 1.

The logistic equation is a special case of the [[Bernoulli differential equation]] and has the following solution:
<math display="block">f(x) = \frac{e^x}{e^x + C}.</math>
Choosing the constant of integration <math>C = 1</math> gives the other well-known form of the definition of the logistic curve:
<math display="block">f(x) = \frac{e^x}{e^x + 1} = \frac{1}{1 + e^{-x}}.</math>

More quantitatively, as can be seen from the analytical solution, the logistic curve shows early [[exponential growth]] for negative argument, which slows to linear growth of slope 1/4 for an argument near 0, then approaches 1 with an exponentially decaying gap.

The differential equation derived above is a special case of a general differential equation that models the sigmoid function only for <math>x > 0</math>. In many modeling applications, the more ''general form''<ref>Kyurkchiev, Nikolay, and Svetoslav Markov. "Sigmoid functions: some approximation and modelling aspects". LAP LAMBERT Academic Publishing, Saarbrücken (2015).</ref>
<math display="block">\frac{df(x)}{dx} = \frac{k}{L} f(x)\big(L - f(x)\big), \quad f(0) = \frac{L}{1 + e^{k x_0}}</math>
can be desirable. Its solution is the shifted and scaled [[sigmoid function]] <math>L \sigma\big(k(x - x_0)\big) = \frac{L}{1 + e^{-k(x - x_0)}}</math>.
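As an illustration of the general form, the following short Python sketch (the parameter values <math>L = 6</math>, <math>k = 1.5</math>, <math>x_0 = 2</math> and the grid are arbitrary choices, not taken from the cited source) evaluates the shifted and scaled sigmoid and checks by finite differences that it satisfies <math>\tfrac{df}{dx} = \tfrac{k}{L} f (L - f)</math>:

<syntaxhighlight lang="python">
import numpy as np

# Shifted and scaled logistic function: f(x) = L / (1 + exp(-k * (x - x0))).
def general_logistic(x, L, k, x0):
    return L / (1.0 + np.exp(-k * (x - x0)))

L, k, x0 = 6.0, 1.5, 2.0          # illustrative parameter values
x = np.linspace(-4.0, 8.0, 1201)
f = general_logistic(x, L, k, x0)

# Central-difference derivative compared with the right-hand side (k/L) f (L - f).
df_numeric = np.gradient(f, x)
rhs = (k / L) * f * (L - f)

# The discrepancy should be small (limited by the finite-difference step).
print(np.max(np.abs(df_numeric - rhs)))
</syntaxhighlight>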