== Analytic approximations ==
Approximations to the Heaviside step function are of use in [[biochemistry]] and [[neuroscience]], where [[logistic function|logistic]] approximations of step functions (such as the [[Hill equation (biochemistry)|Hill]] and the [[Michaelis–Menten kinetics|Michaelis–Menten equations]]) may be used to approximate binary cellular switches in response to chemical signals.
[[File:Step function approximation.png|alt=A set of functions that successively approach the step function|thumb|500x500px|<math>\tfrac{1}{2} + \tfrac{1}{2} \tanh(kx) = \frac{1}{1+e^{-2kx}}</math><br>approaches the step function as {{math|''k'' → ∞}}.]]

For a [[Smooth function|smooth]] approximation to the step function, one can use the [[logistic function]]
<math display="block">H(x) \approx \tfrac{1}{2} + \tfrac{1}{2}\tanh kx = \frac{1}{1+e^{-2kx}},</math>
where a larger {{mvar|k}} corresponds to a sharper transition at {{math|''x'' {{=}} 0}}. If we take {{math|''H''(0) {{=}} {{sfrac|1|2}}}}, equality holds in the limit:
<math display="block">H(x)=\lim_{k \to \infty}\tfrac{1}{2}(1+\tanh kx)=\lim_{k \to \infty}\frac{1}{1+e^{-2kx}}.</math>

There are [[Sigmoid function#Examples|many other smooth, analytic approximations]] to the step function.<ref>{{MathWorld | urlname=HeavisideStepFunction | title=Heaviside Step Function}}</ref> Among the possibilities are:
<math display="block">\begin{align}
H(x) &= \lim_{k \to \infty} \left(\tfrac{1}{2} + \tfrac{1}{\pi}\arctan kx\right)\\
H(x) &= \lim_{k \to \infty} \left(\tfrac{1}{2} + \tfrac{1}{2}\operatorname{erf} kx\right)
\end{align}</math>

These limits hold [[pointwise]] and in the sense of [[distribution (mathematics)|distributions]]. In general, however, pointwise convergence need not imply distributional convergence, nor does distributional convergence imply pointwise convergence. (However, if all members of a pointwise convergent sequence of functions are uniformly bounded by some "nice" function, then [[Lebesgue dominated convergence theorem|convergence holds in the sense of distributions too]].)

In general, any [[cumulative distribution function]] of a [[continuous distribution|continuous]] [[probability distribution]] that is peaked around zero and has a parameter controlling its [[variance]] can serve as an approximation, in the limit as the variance approaches zero. For example, all three of the above approximations are [[cumulative distribution function|cumulative distribution functions]] of common probability distributions: the [[logistic distribution|logistic]], [[Cauchy distribution|Cauchy]] and [[normal distribution|normal]] distributions, respectively.
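The convergence of these approximations can be checked numerically. The following short Python sketch is illustrative only (the helper names are chosen here, not drawn from the cited reference); it evaluates the logistic, arctan and erf approximations at a fixed point for increasing {{mvar|k}}:

<syntaxhighlight lang="python">
# Minimal numerical check (illustrative): the logistic, arctan (Cauchy CDF)
# and erf (normal CDF) approximations above all tend to the step function's
# value as k grows.
import math

def logistic(x, k):
    return 0.5 + 0.5 * math.tanh(k * x)   # equals 1 / (1 + exp(-2*k*x))

def arctan_approx(x, k):
    return 0.5 + math.atan(k * x) / math.pi

def erf_approx(x, k):
    return 0.5 + 0.5 * math.erf(k * x)

x = 0.1  # a fixed point to the right of zero, where H(x) = 1
for k in (1, 10, 100, 1000):
    print(k, round(logistic(x, k), 6), round(arctan_approx(x, k), 6),
          round(erf_approx(x, k), 6))
# Each column approaches 1; at x = -0.1 each would approach 0,
# and at x = 0 every approximation equals exactly 1/2.
</syntaxhighlight>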