Asymptotic analysis
{{short description|Description of limiting behavior of a function}}
{{about|the behavior of functions as inputs approach infinity or some other limit value|asymptotes in [[geometry]]|Asymptote}}
In [[mathematical analysis]], '''asymptotic analysis''', also known as '''asymptotics''', is a method of describing [[Limit (mathematics)|limiting]] behavior. As an illustration, suppose that we are interested in the properties of a function {{math|''f'' (''n'')}} as {{mvar|n}} becomes very large. If {{math|1=''f''(''n'') = ''n''<sup>2</sup> + 3''n''}}, then as {{mvar|n}} becomes very large, the term {{math|3''n''}} becomes insignificant compared to {{math|''n''<sup>2</sup>}}. The function {{math|''f''(''n'')}} is said to be "''asymptotically equivalent'' to {{math|''n''<sup>2</sup>}}, as {{math|''n'' → ∞}}". This is often written symbolically as {{math|''f'' (''n'') ~ ''n''<sup>2</sup>}}, which is read as "{{math|''f''(''n'')}} is asymptotic to {{math|''n''<sup>2</sup>}}".

An example of an important asymptotic result is the [[prime number theorem]]. Let {{math|π(''x'')}} denote the [[prime-counting function]] (which is not directly related to the constant [[pi]]), i.e. {{math|π(''x'')}} is the number of [[prime number]]s that are less than or equal to {{mvar|x}}. Then the theorem states that
<math display="block">\pi(x)\sim\frac{x}{\ln x}.</math>

Asymptotic analysis is commonly used in [[computer science]] as part of the [[analysis of algorithms]] and is often expressed there in terms of [[big O notation]].

== Definition ==
Formally, given functions {{math|''f'' (''x'')}} and {{math|''g''(''x'')}}, we define a binary relation
<math display="block">f(x) \sim g(x) \quad (\text{as } x\to\infty)</math>
if and only if {{Harv|de Bruijn|1981|loc=§1.4}}
<math display="block">\lim_{x \to \infty} \frac{f(x)}{g(x)} = 1.</math>
The symbol {{math|~}} is the [[tilde]].
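The two asymptotic statements above can be checked numerically. The following Python sketch (an illustration added here, not part of the original text) verifies that {{math|''n''<sup>2</sup> + 3''n''}} is asymptotic to {{math|''n''<sup>2</sup>}} and that the prime-counting function tracks {{math|''x''/ln ''x''}}:

```python
import math

# The ratio (n^2 + 3n) / n^2 tends to 1 as n grows, so the two
# functions are asymptotically equivalent.
def ratio(n):
    return (n**2 + 3*n) / n**2

print(ratio(10), ratio(1000), ratio(10**6))  # ratios approach 1

# Prime number theorem: pi(x) ~ x / ln x.
def prime_count(limit):
    """Count primes <= limit with a simple sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = [False] * len(sieve[p*p::p])
    return sum(sieve)

x = 10**6
# pi(x) divided by x/ln x; still about 1.08 at x = 10^6, and the
# convergence toward 1 is known to be slow.
print(prime_count(x) * math.log(x) / x)
```

Note that convergence in the prime number theorem is slow; the ratio is still several percent away from 1 at {{math|1=''x'' = 10<sup>6</sup>}}.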
The relation is an [[equivalence relation]] on the set of functions of {{mvar|x}}; the functions {{mvar|f}} and {{mvar|g}} are said to be ''asymptotically equivalent''. The [[Domain of a function|domain]] of {{mvar|f}} and {{mvar|g}} can be any set for which the limit is defined: e.g. real numbers, complex numbers, positive integers.

The same notation is also used for other ways of passing to a limit: e.g. {{math|''x'' → 0}}, {{math|''x'' ↓ 0}}, {{math|{{abs|''x''}} → 0}}. The way of passing to the limit is often not stated explicitly, if it is clear from the context.

Although the above definition is common in the literature, it is problematic if {{math|''g''(''x'')}} is zero infinitely often as {{mvar|x}} goes to the limiting value. For that reason, some authors use an alternative definition. The alternative definition, in [[little-o notation]], is that {{math|''f'' ~ ''g''}} if and only if
<math display="block">f(x)=g(x)(1+o(1)).</math>
This definition is equivalent to the prior definition if {{math|''g''(''x'')}} is not zero in some [[Neighbourhood (mathematics)|neighbourhood]] of the limiting value.<ref>{{SpringerEOM |id=Asymptotic_equality |title=Asymptotic equality}}</ref><ref>{{Harvtxt|Estrada|Kanwal|2002|loc=§1.2}}</ref>

== Properties ==
If <math>f \sim g</math> and <math>a \sim b</math>, then, under some mild conditions,{{Explain|reason=Many readers might like to know what the exact conditions are.|date=July 2021}} the following hold:
* <math>f^r \sim g^r</math>, for every real {{mvar|r}}
* <math>\log(f) \sim \log(g)</math> if <math>\lim g \neq 1</math>
* <math>f\times a \sim g\times b</math>
* <math>f / a \sim g / b</math>
Such properties allow asymptotically equivalent functions to be freely exchanged in many algebraic expressions. Also, if we further have <math>g \sim h</math>, then, since {{math|~}} is a [[transitive relation]], we also have <math>f \sim h</math>.

==Examples of asymptotic formulas==
* [[Factorial]] <math display="block">n!
\sim \sqrt{2\pi n}\left(\frac{n}{e}\right)^n</math> {{pb}} This is [[Stirling's approximation]].
* [[Partition function (number theory)|Partition function]] {{pb}} For a positive integer ''n'', the partition function, ''p''(''n''), gives the number of ways of writing the integer ''n'' as a sum of positive integers, where the order of addends is not considered. <math display="block">p(n)\sim \frac{1}{4n\sqrt{3}} e^{\pi\sqrt{\frac{2n}{3}}}</math>
* [[Airy function]] {{pb}} The Airy function, Ai(''x''), is a solution of the differential equation {{math|1=''y″'' − ''xy'' = 0}}; it has many applications in physics. <math display="block">\operatorname{Ai}(x) \sim \frac{e^{-\frac{2}{3} x^\frac{3}{2}}}{2\sqrt{\pi} x^{1/4}}</math>
* [[Hankel functions]] <math display="block">\begin{align} H_\alpha^{(1)}(z) &\sim \sqrt{\frac{2}{\pi z}} e^{ i\left(z - \frac{2\pi\alpha + \pi}{4}\right)} \\ H_\alpha^{(2)}(z) &\sim \sqrt{\frac{2}{\pi z}} e^{-i\left(z - \frac{2\pi\alpha + \pi}{4}\right)} \end{align}</math>

==Asymptotic expansion==
{{main|Asymptotic expansion}}
An [[asymptotic expansion]] of a function {{math|''f''(''x'')}} is in practice an expression of that function in terms of a [[series (mathematics)|series]], the [[partial sum]]s of which do not necessarily converge, but such that taking any initial partial sum provides an asymptotic formula for {{mvar|f}}. The idea is that successive terms provide an increasingly accurate description of the order of growth of {{mvar|f}}.

In symbols, it means we have <math>f \sim g_1,</math> but also <math>f - g_1 \sim g_2</math> and <math>f - g_1 - \cdots - g_{k-1} \sim g_{k}</math> for each fixed ''k''.
In view of the definition of the <math>\sim</math> symbol, the last equation means <math>f - (g_1 + \cdots + g_k) = o(g_k)</math> in the [[Big O notation#Little-o notation|little o notation]], i.e., <math>f - (g_1 + \cdots + g_k)</math> is much smaller than <math>g_k.</math>

The relation <math>f - g_1 - \cdots - g_{k-1} \sim g_{k}</math> takes its full meaning if <math>g_{k+1} = o(g_k)</math> for all ''k'', which means the <math>g_k</math> form an [[asymptotic scale]]. In that case, some authors may [[Abuse of notation|abusively]] write <math>f \sim g_1 + \cdots + g_k</math> to denote the statement <math>f - (g_1 + \cdots + g_k) = o(g_k).</math> One should however be careful that this is not a standard use of the <math>\sim</math> symbol, and that it does not correspond to the definition given in {{section link||Definition}}.

In the present situation, the relation <math>g_{k} = o(g_{k-1})</math> actually follows from combining steps ''k'' and ''k''−1: by subtracting <math>f - g_1 - \cdots - g_{k-2} = g_{k-1} + o(g_{k-1})</math> from <math>f - g_1 - \cdots - g_{k-2} - g_{k-1} = g_{k} + o(g_{k}),</math> one gets <math>g_{k} + o(g_{k})=o(g_{k-1}),</math> i.e. <math>g_{k} = o(g_{k-1}).</math>

In case the asymptotic expansion does not converge, for any particular value of the argument there will be a particular partial sum which provides the best approximation and adding additional terms will decrease the accuracy. This optimal partial sum will usually have more terms as the argument approaches the limit value.
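This optimal-truncation behavior can be seen concretely. The sketch below (an illustration added here, using the standard divergent expansion <math>\sqrt{\pi}\,x e^{x^2}\operatorname{erfc}(x) \sim 1+\sum_{n\ge 1}(-1)^n (2n-1)!!/(2x^2)^n</math>) computes the truncation error term by term: it shrinks over the first few terms and then grows without bound.

```python
import math

def erfc_asymptotic(x, terms):
    """Partial sum of the divergent asymptotic expansion
    sqrt(pi)*x*exp(x^2)*erfc(x) ~ 1 + sum_{n>=1} (-1)^n (2n-1)!! / (2x^2)^n."""
    s, term = 1.0, 1.0
    for n in range(1, terms + 1):
        term *= (2*n - 1) / (2 * x * x)   # (2n-1)!! / (2x^2)^n, built up recursively
        s += (-1)**n * term
    return s

x = 2.0
exact = math.sqrt(math.pi) * x * math.exp(x * x) * math.erfc(x)

# Error of the k-term partial sum: decreases at first, then blows up,
# so there is an optimal number of terms for this fixed x.
errors = [abs(erfc_asymptotic(x, k) - exact) for k in range(1, 12)]
print(["%.4f" % e for e in errors])
```

For larger {{mvar|x}} the optimal truncation point moves further out and the best achievable error becomes smaller, consistent with the discussion above.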
===Examples of asymptotic expansions===
* [[Gamma function]] <math display="block">\frac{e^x}{x^x \sqrt{2\pi x}} \Gamma(x+1) \sim 1+\frac{1}{12x}+\frac{1}{288x^2}-\frac{139}{51840x^3}-\cdots \ (x \to \infty)</math>
* [[Exponential integral]] <math display="block">xe^xE_1(x) \sim \sum_{n=0}^\infty \frac{(-1)^nn!}{x^n} \ (x \to \infty) </math>
* [[Error function]] <math display="block"> \sqrt{\pi}x e^{x^2}\operatorname{erfc}(x) \sim 1+\sum_{n=1}^\infty (-1)^n \frac{(2n-1)!!}{(2x^2)^n} \ (x \to \infty)</math> where {{math|(2''n'' − 1)!!}} is the [[double factorial]].

===Worked example===
Asymptotic expansions often occur when an ordinary series is used in a formal expression that forces the taking of values outside of its domain of convergence. For example, we might start with the ordinary series
<math display="block">\frac{1}{1-w}=\sum_{n=0}^\infty w^n</math>
The expression on the left is valid on the entire complex plane apart from <math>w = 1</math>, while the right hand side converges only for <math>|w|< 1</math>. Multiplying by <math>e^{-w/t}</math> and integrating both sides yields
<math display="block"> \int_0^\infty \frac{e^{-\frac{w}{t}}}{1 - w} \, dw = \sum_{n=0}^\infty t^{n+1} \int_0^\infty e^{-u} u^n \, du</math>
The integral on the left hand side can be expressed in terms of the [[exponential integral]]. The integral on the right hand side, after the substitution <math>u=w/t</math>, may be recognized as the [[gamma function]]. Evaluating both, one obtains the asymptotic expansion
<math display="block">e^{-\frac{1}{t}} \operatorname{Ei}\left(\frac{1}{t}\right) = \sum _{n=0}^\infty n! \; t^{n+1} </math>
Here, the right hand side is clearly not convergent for any non-zero value of ''t''. However, by keeping ''t'' small, and truncating the series on the right to a finite number of terms, one may obtain a fairly good approximation to the value of <math>\operatorname{Ei}(1/t)</math>.
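The quality of such truncations can be checked numerically. In the sketch below (an illustration added here; evaluating <math>E_1</math> by a truncated trapezoidal rule is an assumption of this example, not anything from the text), partial sums of the divergent series <math>\sum (-1)^n n!/x^n</math> approach <math>x e^x E_1(x)</math> for {{math|1=''x'' = 10}}:

```python
import math

def E1(x, upper=60.0, steps=200000):
    """Exponential integral E1(x) = integral_x^inf e^{-t}/t dt, computed by
    substituting t = x + u: e^{-x} * integral_0^inf e^{-u}/(x+u) du,
    with the trapezoidal rule on [0, upper] (the tail beyond 60 is negligible)."""
    h = upper / steps
    total = 0.5 * (1.0 / x + math.exp(-upper) / (x + upper))
    for i in range(1, steps):
        u = i * h
        total += math.exp(-u) / (x + u)
    return math.exp(-x) * h * total

def asymptotic_series(x, terms):
    """Partial sum of x e^x E1(x) ~ sum_{n>=0} (-1)^n n! / x^n."""
    s, term = 0.0, 1.0
    for n in range(terms):
        s += term
        term *= -(n + 1) / x   # next term: (-1)^{n+1} (n+1)! / x^{n+1}
    return s

x = 10.0
exact = x * math.exp(x) * E1(x)
for k in (2, 4, 6):
    print(k, abs(asymptotic_series(x, k) - exact))  # error shrinks with k, at first
```

For {{math|1=''x'' = 10}} the first several partial sums improve rapidly; as with any divergent asymptotic series, adding terms far beyond the optimal truncation point would eventually make the approximation worse.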
Substituting <math>x = -1/t</math> and noting that <math>\operatorname{Ei}(x) = -E_1(-x)</math> results in the asymptotic expansion given earlier in this article.

== Asymptotic distribution ==
{{main|Asymptotic distribution}}
In [[mathematical statistics]], an '''[[asymptotic distribution]]''' is a hypothetical distribution that is in a sense the "limiting" distribution of a sequence of distributions. A distribution is an ordered set of random variables {{math|''Z''<sub>''i''</sub>}} for {{math|1=''i'' = 1, …, ''n''}}, for some positive integer {{math|''n''}}. An asymptotic distribution allows {{math|''i''}} to range without bound, that is, {{math|''n''}} is infinite.

A special case of an asymptotic distribution is when the late entries go to zero—that is, the {{math|''Z''<sub>''i''</sub>}} go to 0 as {{math|''i''}} goes to infinity. Some instances of "asymptotic distribution" refer only to this special case.

This is based on the notion of an [[asymptotic]] function which cleanly approaches a constant value (the ''asymptote'') as the independent variable goes to infinity; "clean" in this sense meaning that for any desired closeness epsilon there is some value of the independent variable after which the function never differs from the constant by more than epsilon.

An [[asymptote]] is a straight line that a curve approaches but never meets or crosses. Informally, one may speak of the curve meeting the asymptote "at infinity" although this is not a precise definition. In the equation <math>y = \frac{1}{x},</math> ''y'' becomes arbitrarily small in magnitude as ''x'' increases.

==Applications==
Asymptotic analysis is used in several [[mathematical sciences]]. In [[statistics]], asymptotic theory provides limiting approximations of the [[probability distribution]] of [[sample statistic]]s, such as the [[Likelihood-ratio test|likelihood ratio]] [[statistic]] and the [[expected value]] of the [[deviance (statistics)|deviance]].
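A minimal simulation can illustrate a limiting distribution of a sample statistic. The sketch below (an illustration added here, using the classical central limit theorem as its example rather than anything from the text) standardizes the mean of {{mvar|n}} uniform draws and checks that it behaves like a standard normal variable:

```python
import math
import random

# Central limit theorem: the standardized sample mean of n Uniform(0,1)
# draws is asymptotically N(0,1) as n grows.
random.seed(0)

def standardized_mean(n):
    """Standardize the mean of n Uniform(0,1) draws: mean 1/2, variance 1/(12n)."""
    m = sum(random.random() for _ in range(n)) / n
    return (m - 0.5) / math.sqrt(1 / (12 * n))

n, trials = 400, 2000
z = [standardized_mean(n) for _ in range(trials)]
within_one_sd = sum(abs(v) <= 1 for v in z) / trials
print(within_one_sd)  # close to 0.6827, the standard-normal probability of |Z| <= 1
```

The empirical fraction within one standard deviation approximates the normal value 0.6827, even though each individual draw is uniform, not normal.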
Asymptotic theory does not provide a method of evaluating the finite-sample distributions of sample statistics, however. Non-asymptotic bounds are provided by methods of [[approximation theory]].

Examples of applications are the following.
* In [[applied mathematics]], asymptotic analysis is used to build [[numerical method]]s to approximate [[equation]] solutions.
* In [[mathematical statistics]] and [[probability theory]], asymptotics are used in analysis of long-run or large-sample behaviour of random variables and estimators.
* In [[computer science]] in the [[analysis of algorithms]], considering the performance of algorithms.
* The behavior of [[physical system]]s, an example being [[statistical mechanics]].
* In [[accident analysis]] when identifying the causation of crashes through count modeling with large numbers of crash counts in a given time and space.

Asymptotic analysis is a key tool for exploring the [[ordinary differential equation|ordinary]] and [[partial differential equation|partial]] differential equations which arise in the [[mathematical model]]ling of real-world phenomena.<ref name="Howison">Howison, S. (2005), ''[https://books.google.com/books?id=A2Hy_54Y1MsC&q=%22asymptotic+analysis%22 Practical Applied Mathematics]'', [[Cambridge University Press]]</ref> An illustrative example is the derivation of the [[Boundary layer#Boundary layer equations|boundary layer equations]] from the full [[Navier-Stokes equations]] governing fluid flow. In many cases, the asymptotic expansion is in powers of a small parameter, {{mvar|ε}}: in the boundary layer case, this is the [[dimensional analysis|nondimensional]] ratio of the boundary layer thickness to a typical length scale of the problem. Indeed, applications of asymptotic analysis in mathematical modelling often<ref name="Howison" /> center around a nondimensional parameter which has been shown, or assumed, to be small through a consideration of the scales of the problem at hand.
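The idea of expanding in powers of a small parameter can be sketched on a toy problem (chosen here purely for illustration, not taken from the text): the root of {{math|1=''εx''<sup>2</sup> + ''x'' − 1 = 0}} near {{math|1=''x'' = 1}} has the expansion {{math|''x'' ~ 1 − ''ε'' + 2''ε''<sup>2</sup> − 5''ε''<sup>3</sup> + ⋯}} as {{math|''ε'' → 0}}, which can be checked against the exact root from the quadratic formula:

```python
import math

def exact_root(eps):
    """The root of eps*x^2 + x - 1 = 0 that stays near x = 1 as eps -> 0."""
    return (-1 + math.sqrt(1 + 4 * eps)) / (2 * eps)

def expansion(eps, order):
    """Partial sums of x ~ 1 - eps + 2*eps^2 - 5*eps^3 + ...
    (coefficients are signed Catalan numbers)."""
    coeffs = [1, -1, 2, -5]
    return sum(c * eps**n for n, c in enumerate(coeffs[:order + 1]))

eps = 0.01
for order in range(4):
    # Each additional power of eps shrinks the error by roughly a factor eps.
    print(order, abs(expansion(eps, order) - exact_root(eps)))
```

Each extra term reduces the error by roughly a factor of {{mvar|ε}}, which is exactly the sense in which a perturbation expansion in a small nondimensional parameter improves order by order.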
Asymptotic expansions typically arise in the approximation of certain integrals ([[Laplace's method]], [[saddle-point method]], [[method of steepest descent]]) or in the approximation of probability distributions ([[Edgeworth series]]). The [[Feynman graphs]] in [[quantum field theory]] are another example of asymptotic expansions which often do not converge.

=== Asymptotic versus Numerical Analysis ===
De Bruijn illustrates the use of asymptotics in the following dialog between Dr. N.A., a Numerical Analyst, and Dr. A.A., an Asymptotic Analyst:
<blockquote>N.A.: I want to evaluate my function <math>f(x)</math> for large values of <math>x</math>, with a relative error of at most 1%.

A.A.: <math>f(x)=x^{-1}+\mathrm O(x^{-2}) \qquad (x\to\infty)</math>.

N.A.: I am sorry, I don't understand.

A.A.: <math>|f(x)-x^{-1}|<8x^{-2} \qquad (x>10^4).</math>

N.A.: But my value of <math>x</math> is only 100.

A.A.: Why did you not say so? My evaluations give<blockquote><math>|f(x)-x^{-1}|<57000x^{-2} \qquad (x>100).</math></blockquote>

N.A.: This is no news to me. I know already that <math>0<f(100)<1</math>.

A.A.: I can gain a little on some of my estimates. Now I find that<blockquote><math>|f(x)-x^{-1}|<20x^{-2} \qquad (x>100).</math></blockquote>

N.A.: I asked for 1%, not for 20%.

A.A.: It is almost the best thing I possibly can get. Why don't you take larger values of <math>x</math>?

N.A.: !!! I think it's better to ask my electronic computing machine.

Machine: f(100) = 0.01137 42259 34008 67153

A.A.: Haven't I told you so? My estimate of 20% was not far off from the 14% of the real error.

N.A.: !!! . . . !

Some days later, Miss N.A. wants to know the value of f(1000), but her machine would take a month of computation to give the answer.
She returns to her Asymptotic Colleague, and gets a fully satisfactory reply.<ref>{{Cite book |last=Bruijn |first=Nicolaas Govert de |title=Asymptotic methods in analysis |date=1981 |publisher=Dover Publications |isbn=978-0-486-64221-5 |series=Dover books on advanced mathematics |location=New York |pages=19}}</ref></blockquote>

==See also==
{{div col|colwidth=30em}}
*{{annotated link|Asymptote}}
*{{annotated link|Asymptotic computational complexity}}
*{{annotated link|Asymptotic density}}
*{{annotated link|Asymptotic theory (statistics)}}
*{{annotated link|Asymptotology}}
*{{annotated link|Big O notation}}
*{{annotated link|Leading-order term}}
*{{annotated link|Method of dominant balance}}
*{{annotated link|Method of matched asymptotic expansions}}
*{{annotated link|Watson's lemma}}
{{div col end}}

==Notes==
<references />

==References==
* {{citation |last=Balser |first=W. |title=From Divergent Power Series To Analytic Functions |year=1994 |publisher=[[Springer-Verlag]] |url=https://books.google.com/books?id=V-17CwAAQBAJ |isbn=9783540485940}}
* {{citation |last=de Bruijn |first=N. G. |author-link=Nicolaas Govert de Bruijn |title=Asymptotic Methods in Analysis |year=1981 |publisher=[[Dover Publications]] |url=https://books.google.com/books?id=Oqj9AgAAQBAJ |isbn=9780486642215}}
* {{citation |last1=Estrada |first1=R. |last2=Kanwal |first2=R. P. |title=A Distributional Approach to Asymptotics |year=2002 |publisher=[[Birkhäuser]] |url=https://books.google.com/books?id=X3cECAAAQBAJ |isbn=9780817681302}}
* {{citation |last=Miller |first=P. D. |title=Applied Asymptotic Analysis |year=2006 |publisher=[[American Mathematical Society]] |url=https://books.google.com/books?id=KQvqBwAAQBAJ |isbn=9780821840788}}
* {{citation |last=Murray |first=J. D. |title=Asymptotic Analysis |year=1984 |publisher=Springer |url=https://books.google.com/books?id=PC3rBwAAQBAJ |isbn=9781461211228}}
* {{citation |last1=Paris |first1=R. B. |last2=Kaminsky |first2=D. |year=2001 |title=Asymptotics and Mellin-Barnes Integrals |publisher=[[Cambridge University Press]] |url=https://www.researchgate.net/publication/39064661}}

==External links==
* [https://www.iospress.nl/journal/asymptotic-analysis/ ''Asymptotic Analysis''] —home page of the journal, which is published by [[IOS Press]]
* [https://web.archive.org/web/20070422145944/http://swan.econ.ohio-state.edu/econ840/note4.pdf A paper on time series analysis using asymptotic distribution]

[[Category:Asymptotic analysis| ]]
[[Category:Series (mathematics)]]