== Speed improvements ==
Both for human and artificial intelligence, hardware improvements increase the rate of future hardware improvements. An analogy to [[Moore's Law]] suggests that if the first doubling of speed took 18 months, the second would take 18 subjective months, or 9 external months, whereafter four months, two months, and so on towards a speed singularity.<ref name="arstechnica">{{cite web |last=Siracusa |first=John |date=2009-08-31 |title=Mac OS X 10.6 Snow Leopard: the Ars Technica review |url=https://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/8 |url-status=live |archive-url=https://web.archive.org/web/20110903191143/http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/8 |archive-date=2011-09-03 |access-date=2011-09-09 |work=Ars Technica}}</ref><ref name="yudkowsky1996"/>
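As an illustrative back-of-the-envelope reading of this arithmetic (the 36-month total below is extrapolated from the numbers in the preceding sentence, taking the halving literally; "four months, two months" are rounded values of 4.5 and 2.25, and the figure is not taken from the cited sources), the external time between doublings halves each time, so the total external time before speed diverges is a convergent geometric series:
<math display="block">18 + 9 + 4.5 + 2.25 + \cdots = \sum_{k=0}^{\infty} \frac{18}{2^k} = 36 \text{ months}.</math>
Under this stylized assumption, the "speed singularity" would arrive roughly 36 external months after the start of the first doubling.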
Some upper limit on speed may eventually be reached. Jeff Hawkins has stated that a self-improving computer system would inevitably run into upper limits on computing power: "in the end there are limits to how big and fast computers can run. We would end up in the same place; we'd just get there a bit faster. There would be no singularity."<ref name="ieee-lumi"/>

It is difficult to directly compare [[silicon]]-based hardware with [[neuron]]s. But {{Harvtxt|Berglas|2008}} notes that computer [[speech recognition]] is approaching human capabilities, and that this capability seems to require 0.01% of the volume of the brain. This analogy suggests that modern computer hardware is within a few orders of magnitude of being as powerful as the [[human brain]], as well as taking up far less space. However, the costs of training systems with [[deep learning]] may be larger.{{citation needed |date=February 2025}}{{efn |[[Large language model]]s such as [[ChatGPT]] and [[Llama (language model)|Llama]] require millions of hours of graphics processing unit ([[Graphics processing unit|GPU]]) time. Training Meta's Llama in 2023 took 21 days on 2048 [[Nvidia A100|NVIDIA A100]] GPUs, thus requiring hardware substantially larger than a brain. Training took around a million GPU hours, with an estimated cost of over $2 million. Even so, it is far smaller, and thus easier to train, than an LLM such as ChatGPT, which as of 2023 had 175 billion parameters to adjust, compared to 65 billion for Llama.<ref>{{Cite web |last1=Vanian |first1=Jonathan |last2=Leswing |first2=Kif |date=2023-03-13 |title=ChatGPT and generative AI are booming, but the costs can be extraordinary |url=https://www.cnbc.com/2023/03/13/chatgpt-and-generative-ai-are-booming-but-at-a-very-expensive-price.html |access-date=2025-02-08 |website=CNBC |language=en}}</ref> Training Google's [[Gemini (chatbot)|Gemini LLM]] is estimated to have cost between $30 million and $191 million, similar to that of ChatGPT 4.<ref>{{Cite web |last=Buchholz |first=Katharina |title=The Extreme Cost Of Training AI Models |url=https://www.forbes.com/sites/katharinabuchholz/2024/08/23/the-extreme-cost-of-training-ai-models/ |access-date=2025-02-08 |website=Forbes |language=en}}</ref> }}

===Exponential growth===
[[Image:PPTMooresLawai.jpg|thumb|upright=2|left|Ray Kurzweil writes that, due to [[paradigm shift]]s, a trend of exponential growth extends [[Moore's law]] from [[integrated circuits]] to earlier [[transistor]]s, [[vacuum tube]]s, [[relay]]s, and [[electromechanics|electromechanical]] computers. He predicts that the exponential growth will continue, and that in a few decades the computing power of all computers will exceed that of ("unenhanced") human brains, with superhuman [[artificial intelligence]] appearing around the same time.]]
[[File:Moore's Law over 120 Years.png|thumb|upright=2|An updated version of Moore's law over 120 years (based on [[Ray Kurzweil|Kurzweil's]] [[c:File:PPTMooresLawai.jpg|graph]]). The 7 most recent data points are all [[Nvidia GPUs]].]]
The exponential growth in computing technology suggested by Moore's law is commonly cited as a reason to expect a singularity in the relatively near future, and a number of authors have proposed generalizations of Moore's law. Computer scientist and futurist Hans Moravec proposed in a 1998 book<ref>{{cite book |author=Moravec |first=Hans |url=https://books.google.com/books?id=fduW6KHhWtQC&pg=PA61 |title=Robot: Mere Machine to Transcendent Mind |publisher=Oxford University Press |year=1999 |isbn=978-0-19-513630-2 |page=61 |language=en}}</ref> that the exponential growth curve could be extended back through earlier computing technologies prior to the [[integrated circuit]].

[[Ray Kurzweil]] postulates a [[law of accelerating returns]] in which the speed of technological change (and more generally, all evolutionary processes)<ref name="kurzweil1999"/> increases exponentially, generalizing Moore's law in the same manner as Moravec's proposal, and also including material technology (especially as applied to [[nanotechnology]]), [[Medical Technology|medical technology]] and others.<ref name="kurzweil2005"/> Between 1986 and 2007, machines' application-specific capacity to compute information per capita roughly doubled every 14 months; the per capita capacity of the world's general-purpose computers has doubled every 18 months; the global telecommunication capacity per capita doubled every 34 months; and the world's storage capacity per capita doubled every 40 months.<ref name="HilbertLopez2011">[https://www.science.org/doi/10.1126/science.1200970 "The World's Technological Capacity to Store, Communicate, and Compute Information"] {{Webarchive|url=https://web.archive.org/web/20130727161911/http://www.sciencemag.org/content/332/6025/60|date=2013-07-27}}, Martin Hilbert and Priscila López (2011), [[Science (journal)|Science]], 332 (6025), pp. 60–65; free access to the article through: martinhilbert.net/WorldInfoCapacity.html.</ref> On the other hand, it has been argued that the global acceleration pattern having the 21st century singularity as its parameter should be characterized as [[Hyperbolic growth|hyperbolic]] rather than exponential.<ref>{{Cite journal |date=2020 |editor-last=Korotayev |editor-first=Andrey V. |editor2-last=LePoire |editor2-first=David J. |title=The 21st Century Singularity and Global Futures |url=https://link.springer.com/book/10.1007/978-3-030-33730-8 |journal=World-Systems Evolution and Global Futures |language=en |doi=10.1007/978-3-030-33730-8 |isbn=978-3-030-33729-2 |s2cid=241407141 |issn=2522-0985|url-access=subscription }}</ref>
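To make the exponential-versus-hyperbolic distinction concrete, the following minimal sketch uses purely illustrative constants (the doubling time and singularity date below are assumptions for the example, not estimates from the cited sources). An exponential trajectory remains finite at every future date, however steep; a hyperbolic trajectory of the form ''c''/(''t''<sub>s</sub> − ''t'') blows up at a finite time ''t''<sub>s</sub>, which is the formal sense in which a hyperbolic trend has a built-in singularity date.

<syntaxhighlight lang="python">
# Illustrative sketch: exponential vs. hyperbolic growth.
# All constants are assumptions chosen for the example, not data from the sources.

def exponential(t, x0=1.0, doubling_time=1.5):
    """Exponential growth: the quantity doubles every `doubling_time` years."""
    return x0 * 2 ** (t / doubling_time)

def hyperbolic(t, c=100.0, t_singular=25.0):
    """Hyperbolic growth: x(t) = c / (t_singular - t), diverging as t -> t_singular."""
    return c / (t_singular - t)

for t in [0.0, 10.0, 20.0, 24.0, 24.9]:
    print(f"t = {t:5.1f}   exponential ~ {exponential(t):10.1f}   hyperbolic ~ {hyperbolic(t):10.1f}")
</syntaxhighlight>

Running the sketch shows the hyperbolic values exploding as ''t'' approaches 25 while the exponential values, though large, stay finite at every point.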
Kurzweil reserves the term "singularity" for a rapid increase in artificial intelligence (as opposed to other technologies), writing for example that "The Singularity will allow us to transcend these limitations of our biological bodies and brains ... There will be no distinction, post-Singularity, between human and machine".<ref name="kurzweil2005-9"/> He also defines his predicted date of the singularity (2045) in terms of when he expects computer-based intelligences to significantly exceed the sum total of human brainpower, writing that advances in computing before that date "will not represent the Singularity" because they do "not yet correspond to a profound expansion of our intelligence."<ref name="kurzweil2005-135136"/>

===Accelerating change===
[[File:ParadigmShiftsFrr15Events.svg|thumb|upright=2|According to Kurzweil, his [[logarithmic scale|logarithmic graph]] of 15 lists of [[paradigm shift]]s for key [[human history|historic]] events shows an [[exponential growth|exponential]] trend.]]
{{Main|Accelerating change}}
Some singularity proponents argue its inevitability through extrapolation of past trends, especially those pertaining to shortening gaps between improvements to technology. In one of the first uses of the term "singularity" in the context of technological progress, [[Stanislaw Ulam]] tells of a conversation with [[John von Neumann]] about accelerating change:

{{blockquote|One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.<ref name="ulam1958"/>}}

Kurzweil claims that technological progress follows a pattern of [[exponential growth]], following what he calls the "[[law of accelerating returns]]". Whenever technology approaches a barrier, Kurzweil writes, new technologies will surmount it. He predicts [[paradigm shift]]s will become increasingly common, leading to "technological change so rapid and profound it represents a rupture in the fabric of human history".<ref name="Kurzweil 2001">{{Citation |last=Kurzweil |first=Raymond |title=The Law of Accelerating Returns |journal=Nature Physics |volume=4 |issue=7 |page=507 |year=2001 |url=http://lifeboat.com/ex/law.of.accelerating.returns |access-date=2007-08-07 |archive-url=https://web.archive.org/web/20180827014027/https://lifeboat.com/ex/law.of.accelerating.returns |archive-date=2018-08-27 |url-status=live |publisher=Lifeboat Foundation |bibcode=2008NatPh...4..507B |doi=10.1038/nphys1010 |author-link=Raymond Kurzweil |doi-access=free}}.</ref> Kurzweil believes that the singularity will occur by approximately 2045.<ref name="kurzweil2005"/> His predictions differ from Vinge's in that he predicts a gradual ascent to the singularity, rather than Vinge's rapidly self-improving superhuman intelligence.

Oft-cited dangers include those commonly associated with molecular nanotechnology and [[genetic engineering]]. These threats are major issues for both singularity advocates and critics, and were the subject of [[Bill Joy]]'s April 2000 ''[[Wired (magazine)|Wired]]'' magazine article "[[Why The Future Doesn't Need Us]]".<ref name="chalmers2010" /><ref name="Joy2000"/>
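As a purely illustrative toy model of the shortening-gaps argument above (the starting interval and halving rule are assumptions made for this sketch, not Kurzweil's data), suppose each gap between successive paradigm shifts is half the previous one. The gaps then shrink exponentially, which is the kind of trend a logarithmic plot like the one above is meant to reveal, and the total elapsed time converges to a finite horizon even as the number of shifts grows without bound.

<syntaxhighlight lang="python">
import math

# Toy model: hypothetical paradigm shifts whose inter-shift gaps halve each time.
# All numbers are illustrative assumptions, not historical data.
initial_gap_years = 1024.0

gap = initial_gap_years
year = 0.0
for shift in range(1, 11):
    year += gap
    # log2(gap) falls by exactly 1 per shift: a straight line against the shift
    # index on a logarithmic scale, i.e. an exponentially shrinking gap.
    print(f"shift {shift:2d}: year {year:7.1f}, gap {gap:7.1f}, log2(gap) {math.log2(gap):5.1f}")
    gap /= 2

# The elapsed time approaches 2 * initial_gap_years = 2048 years, a finite
# horizon, even though the count of shifts keeps growing without bound.
</syntaxhighlight>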