==Numbers==

Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song [[mnemonic]]s to teach [[sequence]]s to others. All known human languages, except the [[Pirahã language]], have words for at least the [[numeral (linguistics)|numerals]] "one" and "two", and even some animals such as the [[common blackbird|blackbird]] can distinguish a surprising number of items.<ref>{{cite book|first1=Konrad|last1=Lorenz|author-link1=Konrad Lorenz|year=1961|title=King Solomon's Ring|translator1=Marjorie Kerr Wilson|publisher=Methuen|location=London|isbn=0-416-53860-6}}</ref>

Advances in the [[numeral system]] and [[mathematical notation]] eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. Eventually, the operations were formalized, and concepts about the operations became understood well enough to be [[theorem|stated formally]] and even [[mathematical proof|proven]]. See, for example, [[Euclidean algorithm|Euclid's algorithm]] for finding the greatest common divisor of two numbers (a short sketch appears at the end of this section).

By the High Middle Ages, the [[positional notation|positional]] [[Hindu–Arabic numeral system]] had reached [[Europe]], which allowed for the systematic computation of numbers. During this period, the representation of a calculation on [[paper]] allowed the calculation of [[mathematical expression]]s and the tabulation of [[mathematical function]]s such as the [[square root]], the [[common logarithm]] (for use in multiplication and division), and the [[trigonometric function]]s. By the time of [[Isaac Newton]]'s research, paper or vellum was an important [[computing resource]], and even in our present time researchers like [[Enrico Fermi]] would cover random scraps of paper with calculations to satisfy their curiosity about an equation.<ref>{{cite web|title=DIY: Enrico Fermi's Back of the Envelope Calculations|url=http://www.knowqout.com/science-technology/diy-enrico-fermis-back-of-the-envelope-calculations/}}</ref> Even into the period of programmable calculators, [[Richard Feynman]] would unhesitatingly compute by hand any steps that overflowed the [[Computer memory|memory]] of the calculators, just to learn the answer; by 1976 Feynman had purchased an [[HP-25]] calculator with a 49 program-step capacity, and if a differential equation required more than 49 steps to solve, he could simply continue his computation by hand.<ref>"Try numbers" was one of Feynman's problem solving techniques.</ref>
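The following is a minimal sketch of Euclid's algorithm in modern notation; the Python rendering and the function name <code>gcd</code> are illustrative conveniences, not part of the historical material above.

<syntaxhighlight lang="python">
def gcd(a: int, b: int) -> int:
    # Euclid's algorithm: repeatedly replace the pair (a, b)
    # with (b, a mod b) until the remainder is zero; the last
    # nonzero value is the greatest common divisor.
    while b != 0:
        a, b = b, a % b
    return a

# Example: the greatest common divisor of 1071 and 462 is 21.
print(gcd(1071, 462))
</syntaxhighlight>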