===Computational complexity===
[[File:Binary search into array - example.svg|thumb|upright=1.2|[[Binary search]] in a sorted array, an algorithm whose time complexity involves binary logarithms]]
The binary logarithm also frequently appears in the [[analysis of algorithms]],<ref name="gt02"/> not only because of the frequent use of binary number arithmetic in algorithms, but also because binary logarithms occur in the analysis of algorithms based on two-way branching.<ref name="knuth"/> If a problem initially has {{mvar|n}} choices for its solution, and each iteration of the algorithm reduces the number of choices by a factor of two, then the number of iterations needed to select a single choice is again the integral part of {{math|log<sub>2</sub>{{hsp}}''n''}}. This idea is used in the analysis of several [[algorithm]]s and [[data structure]]s. For example, in [[binary search]], the size of the problem to be solved is halved with each iteration, and therefore roughly {{math|log<sub>2</sub>{{hsp}}''n''}} iterations are needed to obtain a solution for a problem of size {{mvar|n}}.<ref>{{citation|title=Algorithms and Data Structures: The Basic Toolbox|first1=Kurt|last1=Mehlhorn|author1-link=Kurt Mehlhorn|first2=Peter|last2=Sanders|author2-link=Peter Sanders (computer scientist)|publisher=Springer|year=2008|isbn=978-3-540-77977-3|pages=34–36|contribution=2.5 An example – binary search|url=http://people.mpi-inf.mpg.de/~mehlhorn/ftp/Toolbox/Introduction.pdf}}.</ref> Similarly, a perfectly balanced [[binary search tree]] containing {{mvar|n}} elements has height {{math|log<sub>2</sub>(''n'' + 1) − 1}}.<ref>{{citation|title=Applied Combinatorics|first1=Fred|last1=Roberts|author1-link=Fred S. Roberts|first2=Barry|last2=Tesman|edition=2nd|publisher=CRC Press|year=2009|isbn=978-1-4200-9983-6|page=206|url=https://books.google.com/books?id=szBLJUhmYOQC&pg=PA206}}.</ref>
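The halving argument can be checked numerically. The following Python sketch (an illustrative example only, not taken from the sources cited above; the helper function name is arbitrary) counts the probes made by a textbook binary search of a million-element sorted array and compares the count with {{math|⌊log<sub>2</sub>{{hsp}}''n''⌋ + 1}}, and also checks the height formula for a perfect binary search tree on {{math|2<sup>''k''</sup> − 1}} keys:

<syntaxhighlight lang="python">
import math

def binary_search_probes(sorted_values, target):
    """Textbook binary search; returns the number of loop iterations (probes)."""
    low, high = 0, len(sorted_values) - 1
    probes = 0
    while low <= high:
        probes += 1
        mid = (low + high) // 2
        if sorted_values[mid] == target:
            break
        if sorted_values[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return probes

n = 1_000_000
data = list(range(n))
# Searching for a value larger than every element is a worst case:
# it takes floor(log2 n) + 1 probes.
print(binary_search_probes(data, n))   # 20
print(math.floor(math.log2(n)) + 1)    # 20, since log2(1000000) is about 19.93

# A perfect binary search tree on n = 2**k - 1 keys has height k - 1,
# matching the formula log2(n + 1) - 1.
k = 10
print(math.log2((2**k - 1) + 1) - 1)   # 9.0
</syntaxhighlight>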
The running time of an algorithm is usually expressed in [[big O notation]], which is used to simplify expressions by omitting their constant factors and lower-order terms. Because logarithms in different bases differ from each other only by a constant factor, algorithms that run in {{math|''O''(log<sub>2</sub>{{hsp}}''n'')}} time can also be said to run in, say, {{math|''O''(log<sub>13</sub> ''n'')}} time. The base of the logarithm in expressions such as {{math|''O''(log ''n'')}} or {{math|''O''(''n'' log ''n'')}} is therefore not important and can be omitted.<ref name="clrs"/><ref>{{citation|title=Introduction to the Theory of Computation|first=Michael|last=Sipser|author-link=Michael Sipser|edition=3rd|publisher=Cengage Learning|year=2012|isbn=9781133187790|contribution=Example 7.4|url=https://books.google.com/books?id=P3f6CAAAQBAJ&pg=PA277|pages=277–278}}.</ref> However, for logarithms that appear in the exponent of a time bound, the base of the logarithm cannot be omitted. For example, {{math|''O''(2<sup>log<sub>2</sub>{{hsp}}''n''</sup>)}} is not the same as {{math|''O''(2<sup>ln ''n''</sup>)}} because the former is equal to {{math|''O''(''n'')}} and the latter to {{math|''O''(''n''<sup>0.6931...</sup>)}}.

Algorithms with running time {{math|''O''(''n'' log ''n'')}} are sometimes called [[linearithmic]].<ref>{{harvtxt|Sedgewick|Wayne|2011}}, [https://books.google.com/books?id=MTpsAQAAQBAJ&pg=PA186 p. 186].</ref> Some examples of algorithms with running time {{math|''O''(log ''n'')}} or {{math|''O''(''n'' log ''n'')}} are:
*[[quicksort|Average time quicksort]] and other [[comparison sort]] algorithms{{sfnmp|1a1 = Cormen|1a2 = Leiserson|1a3=Rivest|1a4=Stein|1y=2001|1p=156|2a1=Goodrich|2a2=Tamassia|2y=2002|2p=238}}
*Searching in balanced [[binary search tree]]s{{sfnmp|1a1 = Cormen|1a2 = Leiserson|1a3=Rivest|1a4=Stein|1y=2001|1p=276|2a1=Goodrich|2a2=Tamassia|2y=2002|2p=159}}
*[[Exponentiation by squaring]]{{sfnmp|1a1 = Cormen|1a2 = Leiserson|1a3=Rivest|1a4=Stein|1y=2001|1p=879–880|2a1=Goodrich|2a2=Tamassia|2y=2002|2p=464}}
*[[Longest increasing subsequence]]<ref>{{citation|title=How to Think About Algorithms|first=Jeff|last=Edmonds|publisher=Cambridge University Press|year=2008|isbn=978-1-139-47175-6|page=302|url=https://books.google.com/books?id=hGuixQMQS_0C&pg=PT280}}.</ref>

Binary logarithms also occur in the exponents of the time bounds for some [[divide and conquer algorithm]]s, such as the [[Karatsuba algorithm]] for multiplying {{mvar|n}}-bit numbers in time {{math|''O''(''n''<sup>log<sub>2</sub>{{hsp}}3</sup>)}},{{sfnmp|1a1 = Cormen|1a2 = Leiserson|1a3=Rivest|1a4=Stein|1y=2001|1p=844|2a1=Goodrich|2a2=Tamassia|2y=2002|2p=279}} and the [[Strassen algorithm]] for multiplying {{math|''n'' × ''n''}} matrices in time {{math|''O''(''n''<sup>log<sub>2</sub>{{hsp}}7</sup>)}}.{{sfnp|Cormen|Leiserson|Rivest|Stein|2001|loc=section 28.2.}} The occurrence of binary logarithms in these running times can be explained by reference to the [[master theorem (analysis of algorithms)|master theorem for divide-and-conquer recurrences]].
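In the case of the Karatsuba algorithm, for instance, the exponent arises because each multiplication of {{mvar|n}}-bit numbers is reduced to three multiplications of numbers about half as long, so the recursion performs {{math|3<sup>log<sub>2</sub>{{hsp}}''n''</sup>}} base-case multiplications, which equals {{math|''n''<sup>log<sub>2</sub>{{hsp}}3</sup>}}. The following Python sketch (illustrative only; it counts subproblems rather than implementing the multiplication, and assumes {{mvar|n}} is a power of two) makes this count explicit:

<syntaxhighlight lang="python">
import math

def karatsuba_base_multiplications(n):
    """Number of single-digit multiplications Karatsuba would perform on
    n-digit inputs, assuming n is a power of two and full recursion."""
    if n == 1:
        return 1
    return 3 * karatsuba_base_multiplications(n // 2)  # three half-size subproblems

n = 1024
print(karatsuba_base_multiplications(n))  # 59049, which is 3**10
print(round(n ** math.log2(3)))           # 59049, i.e. n**(log2 3)
</syntaxhighlight>

The same counting argument, with seven subproblems of half the size, gives the {{math|''n''<sup>log<sub>2</sub>{{hsp}}7</sup>}} bound for the Strassen algorithm.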