Technological singularity
{{Short description|Hypothetical point in time when technological growth becomes uncontrollable and irreversible}} {{Redirect|The Singularity||Singularity (disambiguation)}} {{Use dmy dates|date=March 2023}} {{Futures studies}} The '''technological singularity'''—or simply the '''singularity'''<ref>{{Cite news |last= Cadwalladr |first= Carole |date= 22 February 2014 |title= Are the robots about to rise? Google's new director of engineering thinks so… |work=[[The Guardian]] |url= https://www.theguardian.com/technology/2014/feb/22/robots-google-ray-kurzweil-terminator-singularity-artificial-intelligence |access-date= 8 May 2022}}</ref>—is a [[hypothetical]] point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable consequences for [[human civilization]].<ref>{{cite web |title= Collection of sources defining "singularity" |url= http://www.singularitysymposium.com/definition-of-singularity.html |url-status= dead |archive-url= https://web.archive.org/web/20190417002644/http://www.singularitysymposium.com/definition-of-singularity.html |archive-date=17 April 2019 |access-date=17 April 2019 |website= singularitysymposium.com}}</ref><ref name="Singularity hypotheses">{{cite book |title=Singularity Hypotheses: A Scientific and Philosophical Assessment |series=The Frontiers Collection |year= 2012 |publisher= Springer |isbn= 9783642325601 |editor-last= Eden |editor-first= Amnon H. |location= Dordrecht |pages= 1–2 |doi= 10.1007/978-3-642-32560-1 |url= https://cds.cern.ch/record/1552240 |editor-last2= Moor |editor-first2= James H. |editor-last3= Søraker |editor-first3= Johnny H. |editor-last4= Steinhart |editor-first4= Eric}}</ref> According to the most popular version of the singularity hypothesis, [[I. J. Good]]'s [[#Intelligence explosion|intelligence explosion]] model of 1965, an upgradable [[intelligent agent]] could eventually enter a positive feedback loop of successive [[Recursive self-improvement|self-improvement]] cycles; more intelligent generations would appear more and more rapidly, causing a rapid increase ("explosion") in intelligence which would culminate in a powerful [[superintelligence]], far surpassing all [[human intelligence]].<ref name="vinge1993">Vinge, Vernor. [http://mindstalk.net/vinge/vinge-sing.html "The Coming Technological Singularity: How to Survive in the Post-Human Era"] {{Webarchive|url= https://web.archive.org/web/20180410074243/http://mindstalk.net/vinge/vinge-sing.html|date= 2018-04-10}}, in ''Vision-21: Interdisciplinary Science and Engineering in the Era of Cyberspace'', G. A. Landis, ed., NASA Publication CP-10129, pp. 11–22, 1993. - "There may be developed computers that are "awake" and superhumanly intelligent. (To date, there has been much controversy as to whether we can create human equivalence in a machine.
But if the answer is 'yes, we can', then there is little doubt that beings more intelligent can be constructed shortly thereafter.)"</ref> The Hungarian-American mathematician [[John von Neumann]] (1903–1957) became the first known person to use the concept of a "singularity" in the technological context.<ref>{{Cite book |last=Vinge |first=Vernor |author-link=Vernor Vinge |date=1993 |chapter=The Coming Technological Singularity: How to Survive in the Post-Human Era |chapter-url=https://ntrs.nasa.gov/api/citations/19940022855/downloads/19940022855.pdf |title=Proceedings of a symposium cosponsored by the NASA Lewis Research Center and the Ohio Aerospace Institute and held in Westlake, Ohio March 30-31, 1993 |series=NASA Conference Publication |volume=10129 |page=11 |bibcode=1993vise.nasa...11V }}</ref><ref>{{Cite book |last=Shanahan |first=Murray |url=https://books.google.com/books?id=rAxZCgAAQBAJ |title=The Technological Singularity |date=2015-08-07 |publisher=MIT Press |isbn=978-0-262-52780-4 |page=233 |language=en}}</ref> [[Alan Turing]], often regarded as the father of modern computer science, laid a crucial foundation for contemporary discourse on the technological singularity. His pivotal 1950 paper, "[[Computing Machinery and Intelligence]]", introduced the idea of a machine's ability to exhibit intelligent behavior equivalent to or indistinguishable from that of a human.<ref>{{Cite web |date=2024-08-13 |title=What is the Technological Singularity?
{{!}} IBM |url=https://www.ibm.com/think/topics/technological-singularity |access-date=2024-11-14 |website=www.ibm.com |language=en}}</ref> [[Stanislaw Ulam]] reported in 1958 an earlier discussion with von Neumann "centered on the [[Accelerating change|accelerating progress]] of technology and changes in human life, which gives the appearance of approaching some essential [[Wiktionary:singularity|singularity]] in the history of the race beyond which human affairs, as we know them, could not continue".<ref name="ulam1958" /> Subsequent authors have echoed this viewpoint.<ref name="Singularity hypotheses" /><ref name="chalmers2010" /> The concept and the term "singularity" were popularized by [[Vernor Vinge]]: first in 1983, in an article that claimed that, once humans create intelligences greater than their own, there will be a technological and social transition similar in some sense to "the knotted space-time at the center of a black hole";<ref name="dooling2008-88"/> and later in his 1993 essay "The Coming Technological Singularity",<ref name="vinge1993" /><ref name="chalmers2010"/> in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate, and he would be surprised if it occurred before 2005 or after 2030.<ref name="vinge1993"/> Another significant contribution to wider circulation of the notion was [[Ray Kurzweil]]'s 2005 book ''[[The Singularity Is Near]]'', predicting singularity by 2045.<ref name="chalmers2010"/> <!-- Human extinction: -->Some scientists, including [[Stephen Hawking]], have expressed concerns that [[Superintelligence|artificial superintelligence]] (ASI) could result in human extinction.<ref>{{cite news|last1= Sparkes|first1= Matthew|title= Top scientists call for caution over artificial intelligence|url= 
https://www.telegraph.co.uk/technology/news/11342200/Top-scientists-call-for-caution-over-artificial-intelligence.html|access-date= 24 April 2015|work= [[The Daily Telegraph|The Telegraph (UK)]]|date= 13 January 2015|archive-date= 7 April 2015|archive-url= https://web.archive.org/web/20150407191839/http://www.telegraph.co.uk/technology/news/11342200/Top-scientists-call-for-caution-over-artificial-intelligence.html|url-status= live}}</ref><ref>{{cite web|url= https://www.bbc.com/news/technology-30290540|title= Hawking: AI could end human race|date= 2 December 2014|publisher= BBC|access-date= 11 November 2017|archive-date= 30 October 2015|archive-url= https://web.archive.org/web/20151030054329/http://www.bbc.com/news/technology-30290540|url-status=live}}</ref> The consequences of a technological singularity and its potential benefit or harm to the human race have been intensely debated.{{citation needed|date=April 2025}} <!-- Plausibility: -->Prominent technologists and academics dispute the plausibility of a technological singularity and the associated artificial intelligence explosion, including [[Paul Allen]],<ref name="Allen2011"/> [[Jeff Hawkins]],<ref name="ieee-lumi"/> [[John Henry Holland|John Holland]], [[Jaron Lanier]], [[Steven Pinker]],<ref name="ieee-lumi"/> [[Theodore Modis]],<ref name="modis2012"/> [[Gordon Moore]],<ref name="ieee-lumi" /> and [[Roger Penrose]].<ref>{{Cite book |last=Penrose |first=Roger |title=The emperor's new mind: concerning computers, minds and the laws of physics |date=1999 |publisher=Oxford Univ. Press |isbn=978-0-19-286198-6 |location=Oxford}}</ref> One claim is that artificial intelligence growth is likely to run into decreasing returns instead of accelerating ones, as has been observed in previously developed human technologies.{{citation needed|date=April 2025}}
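The two competing intuitions above, Good's runaway feedback loop and the critics' diminishing-returns objection, can be contrasted with a deliberately minimal numerical sketch. The update rule, constants, and exponents below are illustrative assumptions, not taken from the cited literature:

```python
# Toy model of the self-improvement debate (all numbers are illustrative
# assumptions, not drawn from the cited sources).
# Each self-improvement cycle adds k * I**p to the current capability I:
#   p > 1: each cycle's proportional gain grows (runaway "explosion")
#   p < 1: each cycle's proportional gain shrinks (diminishing returns)

def self_improvement(p, k=0.1, start=1.0, cycles=20):
    """Return the capability level after each self-improvement cycle."""
    levels = [start]
    for _ in range(cycles):
        levels.append(levels[-1] + k * levels[-1] ** p)
    return levels

explosive = self_improvement(p=1.5)  # accelerating-returns regime
dampened = self_improvement(p=0.5)   # diminishing-returns regime

def relative_gain(levels, i):
    """Fractional improvement achieved by cycle i."""
    return (levels[i + 1] - levels[i]) / levels[i]

print(f"p=1.5 final capability: {explosive[-1]:.1f}")
print(f"p=0.5 final capability: {dampened[-1]:.1f}")
```

Under these toy assumptions the p > 1 run's proportional gain per cycle keeps increasing while the p < 1 run's keeps shrinking, which is the quantitative shape of the disagreement between the intelligence-explosion hypothesis and the diminishing-returns objection.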