== <span id="Linearithmic time"></span> Quasilinear time ==

An algorithm is said to run in '''quasilinear time''' (also referred to as '''log-linear time''') if <math>T(n)=O(n\log^kn)</math> for some positive constant {{mvar|k}};<ref>{{cite journal | last1 = Naik | first1 = Ashish V. | last2 = Regan | first2 = Kenneth W. | last3 = Sivakumar | first3 = D. | doi = 10.1016/0304-3975(95)00031-Q | issue = 2 | journal = [[Theoretical Computer Science (journal)|Theoretical Computer Science]] | mr = 1355592 | pages = 325–349 | title = On quasilinear-time complexity theory | url = http://www.cse.buffalo.edu/~regan/papers/pdf/NRS95.pdf | volume = 148 | year = 1995| doi-access = free }}</ref> '''linearithmic time''' is the case <math>k=1</math>.<ref>{{cite book | last1 = Sedgewick | first1 = Robert | last2 = Wayne | first2 = Kevin | edition = 4th | page = 186 | publisher = Pearson Education | title = Algorithms | url = https://algs4.cs.princeton.edu/home/ | year = 2011}}</ref> Using [[soft O notation]] these algorithms are <math>\tilde{O}(n)</math>. Quasilinear time algorithms are also <math>O(n^{1+\varepsilon})</math> for every constant <math>\varepsilon>0</math>, since <math>\log^kn=O(n^\varepsilon)</math> for every fixed {{mvar|k}}, and thus run faster than any polynomial time algorithm whose time bound includes a term <math>n^c</math> for any <math>c>1</math>.

Algorithms which run in quasilinear time include:
* [[In-place merge sort]], <math>O(n\log^2n)</math>
* [[Quicksort]], <math>O(n\log n)</math>, in its randomized version, has a running time that is <math>O(n\log n)</math> in expectation on the worst-case input. Its non-randomized version has an <math>O(n\log n)</math> running time only when considering average-case complexity.
* [[Heapsort]], <math>O(n\log n)</math>, [[merge sort]], [[introsort]], [[binary tree sort]], [[smoothsort]], [[patience sorting]], etc. in the worst case
* [[Fast Fourier transform]]s, <math>O(n\log n)</math>
* [[Monge array]] calculation, <math>O(n\log n)</math>

In many cases, the <math>O(n\log n)</math> running time is simply the result of performing a <math>\Theta(\log n)</math> operation {{mvar|n}} times (for the notation, see {{slink|Big O notation|Family of Bachmann–Landau notations}}). For example, [[binary tree sort]] creates a [[binary tree]] by inserting each element of the {{mvar|n}}-sized array one by one. Since the insert operation on a [[self-balancing binary search tree]] takes <math>O(\log n)</math> time, the entire algorithm takes <math>O(n\log n)</math> time. [[Comparison sort]]s require at least <math>\Omega(n\log n)</math> comparisons in the worst case because <math>\log(n!)=\Theta(n\log n)</math>, by [[Stirling's approximation]]. Quasilinear running times also frequently arise from the [[recurrence relation]] <math display="inline">T(n) = 2T\left(\frac{n}{2}\right)+O(n)</math>, as in the sketch below.
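As a minimal illustration of how this recurrence produces an <math>O(n\log n)</math> bound (a sketch added here for concreteness, not drawn from the cited sources), the following Python implementation of [[merge sort]] makes two recursive calls on half-sized inputs and then does <math>O(n)</math> work merging the results:

<syntaxhighlight lang="python">
def merge_sort(a):
    """Sort a list of comparable items.

    The divide-and-conquer structure realizes the recurrence
    T(n) = 2T(n/2) + O(n), which resolves to O(n log n).
    """
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # T(n/2)
    right = merge_sort(a[mid:])   # T(n/2)
    # Merge step: O(n) work at each level of the recursion.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
</syntaxhighlight>

For example, <code>merge_sort([5, 2, 4, 1, 3])</code> returns <code>[1, 2, 3, 4, 5]</code>. Because the input is halved at each level, the recursion has depth <math>\Theta(\log n)</math>, and the merges at each level total <math>O(n)</math> work, giving <math>O(n\log n)</math> overall.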