==Use in algorithm design==

Evaluating the complexity of an algorithm is an important part of [[algorithm design]], as it gives useful information about the performance that may be expected.

It is a common misconception that evaluating the complexity of algorithms will become less important as a result of [[Moore's law]], which posits the [[exponential growth]] of the power of modern [[computer]]s. This is wrong because the increased power allows working with larger input data ([[big data]]). For example, when one wants to sort alphabetically a list of a few hundred entries, such as the [[bibliography]] of a book, any algorithm should finish in well under a second. On the other hand, for a list of a million entries (the phone numbers of a large town, for example), the elementary algorithms that require <math>O(n^2)</math> comparisons would have to do a trillion comparisons, which would need around a day at a speed of 10 million comparisons per second. By contrast, [[quicksort]] and [[merge sort]] require only about <math>n\log_2 n</math> comparisons (as average-case complexity for the former, as worst-case complexity for the latter). For {{math|1=''n'' = 1,000,000}}, this gives approximately 20,000,000 comparisons, which would take only about 2 seconds at 10 million comparisons per second.

Thus the evaluation of complexity may allow eliminating many inefficient algorithms before any implementation. It may also be used for tuning complex algorithms without testing all variants. By identifying the most costly steps of a complex algorithm, the study of complexity also makes it possible to focus implementation-efficiency efforts on those steps.
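The arithmetic behind the sorting example can be checked with a short script. The figures for {{math|''n''}} and the comparison speed are taken from the text above; the rest is a back-of-the-envelope sketch, not a benchmark of any real sort implementation.

```python
import math

# Figures from the example: a list of one million entries, sorted on a
# machine performing 10 million comparisons per second.
n = 1_000_000
speed = 10_000_000  # comparisons per second

# Elementary O(n^2) sort: n * n = 10^12 (a trillion) comparisons.
quadratic = n * n

# Quicksort (average case) / merge sort (worst case): about n log2 n comparisons.
linearithmic = n * math.log2(n)

print(f"O(n^2):   {quadratic:.0f} comparisons, "
      f"about {quadratic / speed / 3600:.1f} hours")
print(f"n log2 n: {linearithmic:.0f} comparisons, "
      f"about {linearithmic / speed:.1f} seconds")
```

Running this confirms the gap: roughly a day of work for the quadratic algorithm versus about two seconds for the linearithmic ones, on the same hardware.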