Algorithmic efficiency
===Theoretical analysis===
In the theoretical [[analysis of algorithms]], the normal practice is to estimate their complexity in the asymptotic sense. The most commonly used notation to describe resource consumption or "complexity" is [[Donald Knuth]]'s [[Big O notation]], representing the complexity of an algorithm as a function of the size of the input <math display="inline">n</math>. Big O notation is an [[Asymptote|asymptotic]] measure of function complexity, where <math display="inline">f(n) = O\bigl(g(n)\bigr)</math> roughly means the time requirement for an algorithm is proportional to <math>g(n)</math>, omitting [[lower-order terms]] that contribute less than <math>g(n)</math> to the growth of the function as <math display="inline">n</math> [[limit (mathematics)|grows arbitrarily large]]. Because the notation is asymptotic, this estimate may be misleading when <math display="inline">n</math> is small, but it is generally sufficiently accurate when <math display="inline">n</math> is large. For example, bubble sort may be faster than [[merge sort]] when only a few items are to be sorted; however, either implementation is likely to meet performance requirements for a small list. Typically, programmers are interested in algorithms that [[Scalability|scale]] efficiently to large input sizes, and merge sort is preferred over bubble sort for lists of the lengths encountered in most data-intensive programs.

Some examples of Big O notation applied to algorithms' asymptotic time complexity include:

{| class="wikitable"
|-
! Notation !! Name !! Examples
|-
| <math>O(1)</math> || [[Constant time|constant]] || Finding the median of a sorted list of measurements; using a constant-size [[lookup table]]; using a suitable [[hash function]] to look up an item.
|-
| <math>O(\log n)</math> || [[Logarithmic time|logarithmic]] || Finding an item in a sorted array with a [[Binary search algorithm|binary search]] or a balanced search [[Tree data structure|tree]], as well as all operations in a [[Binomial heap]].
|-
| <math>O(n)</math> || [[linear time|linear]] || Finding an item in an unsorted list or a malformed tree (worst case) or in an unsorted array; adding two ''n''-bit integers by [[Ripple carry adder|ripple carry]].
|-
| <math>O(n\log n)</math> || [[Linearithmic time|linearithmic]], loglinear, or quasilinear || Performing a [[Fast Fourier transform]]; [[heapsort]], [[quicksort]] ([[Best, worst and average case|best and average case]]), or [[merge sort]].
|-
| <math>O(n^2)</math> || [[quadratic time|quadratic]] || [[Multiplication|Multiplying]] two ''n''-digit numbers by [[Long multiplication|a simple algorithm]]; [[bubble sort]] (worst case or naive implementation), [[Shell sort]], quicksort ([[Best, worst and average case|worst case]]), [[selection sort]], or [[insertion sort]].
|-
| <math>O(c^n),\;c>1</math> || [[exponential time|exponential]] || Finding the optimal (non-[[Travelling salesman problem#Heuristic and approximation algorithms|approximate]]) solution to the [[travelling salesman problem]] using [[dynamic programming]]; [[Boolean satisfiability problem|determining if two logical statements are equivalent]] using [[brute-force search]].
|}
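The gap between quadratic and linearithmic growth can be made concrete by counting element comparisons rather than measuring wall-clock time. The sketch below (in Python; it is illustrative, not part of the article, and the function names are chosen here) sorts the same random list with a naive bubble sort and a top-down merge sort, each instrumented with a comparison counter:

```python
import random


def bubble_sort(items):
    """Naive bubble sort; returns (sorted list, comparison count). O(n^2) worst case."""
    a = list(items)
    comparisons = 0
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # early exit when already sorted
            break
    return a, comparisons


def merge_sort(items):
    """Top-down merge sort; returns (sorted list, comparison count). O(n log n)."""
    comparisons = 0

    def sort(xs):
        nonlocal comparisons
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = sort(xs[:mid]), sort(xs[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            comparisons += 1
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    return sort(list(items)), comparisons


random.seed(0)
data = [random.random() for _ in range(2000)]
_, bubble_cmp = bubble_sort(data)
_, merge_cmp = merge_sort(data)
# For n = 2000 random items, bubble sort needs on the order of n^2 comparisons,
# merge sort roughly n * log2(n) -- orders of magnitude fewer.
```

For very short lists the counts are comparable, which matches the point above: asymptotic notation says little about small <math display="inline">n</math>, but dominates the picture as input size grows.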