===Computational complexity===
[[Analysis of algorithms]] is a branch of [[computer science]] that studies the [[time complexity|performance]] of [[algorithm]]s (computer programs solving a certain problem).<ref name=Wegener>{{Citation|last1=Wegener|first1=Ingo|title=Complexity theory: exploring the limits of efficient algorithms|publisher=[[Springer-Verlag]]|location=Berlin, New York|isbn=978-3-540-21045-0|year=2005}}, pp. 1–2</ref> Logarithms are valuable for describing algorithms that [[Divide and conquer algorithm|divide a problem]] into smaller ones, and join the solutions of the subproblems.<ref>{{Citation|last1=Harel|first1=David|last2=Feldman|first2=Yishai A.|title=Algorithmics: the spirit of computing|location=New York|publisher=[[Addison-Wesley]]|isbn=978-0-321-11784-7|year=2004}}, p. 143</ref> For example, to find a number in a sorted list, the [[binary search algorithm]] checks the middle entry and proceeds with the half before or after the middle entry if the number is still not found. This algorithm requires, on average, {{math|log<sub>2</sub> (''N'')}} comparisons, where {{mvar|N}} is the list's length.<ref>{{Citation|last=Knuth|first=Donald|author-link=Donald Knuth|title=The Art of Computer Programming|publisher=Addison-Wesley|location=Reading, MA|year=1998|isbn=978-0-201-89685-5|title-link=The Art of Computer Programming}}, section 6.2.1, pp. 409–26</ref> Similarly, the [[merge sort]] algorithm sorts an unsorted list by dividing the list into halves and sorting these first before merging the results. Merge sort algorithms typically require a time [[big O notation|approximately proportional to]] {{math|''N'' · log(''N'')}}.<ref>{{Harvard citations|last=Knuth|first=Donald|year=1998|loc=section 5.2.4, pp. 158–68|nb=yes}}</ref> The base of the logarithm is not specified here, because the result only changes by a constant factor when another base is used. A constant factor is usually disregarded in the analysis of algorithms under the standard [[uniform cost model]].<ref name=Wegener20>{{Citation|last1=Wegener|first1=Ingo|title=Complexity theory: exploring the limits of efficient algorithms|publisher=[[Springer-Verlag]]|location=Berlin, New York|isbn=978-3-540-21045-0|year=2005|page=20}}</ref>

A function {{math|''f''(''x'')}} is said to [[Logarithmic growth|grow logarithmically]] if {{math|''f''(''x'')}} is (exactly or approximately) proportional to the logarithm of {{mvar|x}}. (Biological descriptions of organism growth, however, use this term for an exponential function.<ref>{{Citation|last1=Mohr|first1=Hans|last2=Schopfer|first2=Peter|title=Plant physiology|publisher=Springer-Verlag|location=Berlin, New York|isbn=978-3-540-58016-4|year=1995|url-access=registration|url=https://archive.org/details/plantphysiology0000mohr}}, chapter 19, p. 298</ref>) For example, any [[natural number]] {{mvar|N}} can be represented in [[Binary numeral system|binary form]] in no more than {{math|log<sub>2</sub> ''N'' + 1}} [[bit]]s. In other words, the amount of [[memory (computing)|memory]] needed to store {{mvar|N}} grows logarithmically with {{mvar|N}}.
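Both points above can be illustrated with a short sketch (an illustrative example with assumed names and values, not drawn from the cited references): a binary search halves the remaining candidate range on each comparison, so roughly {{math|log<sub>2</sub> ''N''}} steps suffice, and the binary representation of {{mvar|N}} occupies {{math|⌊log<sub>2</sub> ''N''⌋ + 1}} bits.

<syntaxhighlight lang="python">
from math import log2

def binary_search(sorted_list, target):
    """Illustrative sketch: return (index or None, comparisons made).

    Each loop iteration halves the candidate range, so the number of
    iterations grows like log2(len(sorted_list)).
    """
    lo, hi = 0, len(sorted_list) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_list[mid] == target:
            return mid, comparisons
        elif sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None, comparisons

N = 1_000_000                          # example size, chosen for illustration only
data = list(range(N))                  # a sorted list of N entries
_, used = binary_search(data, N - 1)
print(used, log2(N))                   # about 20 comparisons vs. log2(10**6) ≈ 19.93

# N can be stored in floor(log2(N)) + 1 binary digits:
print(N.bit_length(), int(log2(N)) + 1)   # both print 20
</syntaxhighlight>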