Editing Technological singularity (section)
==Criticism==

Some critics, like philosophers [[Hubert Dreyfus]]<ref name="dreyfus2000"/> and [[John Searle]],<ref>John R. Searle, “What Your Computer Can’t Know”, ''The New York Review of Books'', 9 October 2014, p. 54: "[Computers] have, literally ..., no intelligence, no motivation, no autonomy, and no agency. We design them to behave as if they had certain sorts of psychology, but there is no psychological reality to the corresponding processes or behavior. ... [T]he machinery has no beliefs, desires, [or] motivations."</ref> assert that computers or machines cannot achieve [[human intelligence]]. Others, like physicist [[Stephen Hawking]],<ref name="hawking2018"/> object that whether machines can achieve true intelligence or merely something resembling it is irrelevant if the net result is the same. Psychologist [[Steven Pinker]] stated in 2008: "There is not the slightest reason to believe in a coming singularity. The fact that you can visualize a future in your imagination is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings, and nuclear-powered automobiles—all staples of futuristic fantasies when I was a child that have never arrived. Sheer processing power is not a pixie dust that magically solves all your problems."<ref name="ieee-lumi"/>

[[Martin Ford (author)|Martin Ford]]<ref name="ford2009"/> postulates a "technology paradox": before the singularity could occur, most routine jobs in the economy would be automated, since this would require a level of technology inferior to that of the singularity. The resulting mass unemployment and plummeting consumer demand would in turn destroy the incentive to invest in the very technologies that would be required to bring about the singularity.
Job displacement is increasingly no longer limited to work traditionally considered "routine".<ref name="markoff2011"/>

<!-- Rate of technological innovation: -->[[Theodore Modis]]<ref name="modis2002"/> and [[Jonathan Huebner]]<ref name="huebner2005"/> argue that the rate of technological innovation has not only ceased to rise, but is actually now declining. Evidence for this decline is that the rise in computer [[clock rate]]s is slowing even while Moore's prediction of exponentially increasing circuit density continues to hold: excessive heat builds up in the chip and cannot be dissipated quickly enough to prevent the chip from melting at higher speeds. Advances in speed may be possible in the future by virtue of more power-efficient CPU designs and multi-cell processors.<ref name="krazit2006"/>

<!-- Modis specifically on "singularity": -->[[Theodore Modis]] holds that the singularity cannot happen.<ref>Modis, Theodore (2020). “Forecasting the Growth of Complexity and Change—An Update”. Published in {{cite book |last1=Korotayev |first1=Andrey |title=The 21st Century Singularity and Global Futures |last2=LePoire |first2=David (Eds.) |date=January 3, 2020 |publisher=Springer |isbn=978-3-030-33730-8 |edition=1 |pages=620}} pp. 101–104.</ref><ref name="modis2012">Modis, Theodore (2012). “Why the Singularity Cannot Happen”. Published in {{cite book |last1=Eden |first1=Amnon H. et al (Eds.) |url=http://www.growth-dynamics.com/articles/Singularity.pdf |title=Singularity Hypothesis |date=2012 |publisher=Springer |isbn=978-3-642-32560-1 |location=New York |page=311}} pp. 311–339.</ref><ref name="modis2003">Modis, Theodore (May–June 2003). “[http://www.growth-dynamics.com/articles/futurist.pdf The Limits of Complexity and Change]”. The Futurist.
37 (3): 26–32.</ref> He argues that talk of a "technological singularity", and Kurzweil's treatment in particular, lacks scientific rigor; Kurzweil is alleged to mistake the logistic function (S-function) for an exponential function, and to see a "knee" in an exponential function where there can in fact be no such thing.<ref name="modis2006"/> In a 2021 article, Modis pointed out that no milestones{{snd}}breaks in historical perspective comparable in importance to the Internet, DNA, the transistor, or nuclear energy{{snd}}had been observed in the previous twenty years, while five would have been expected according to the exponential trend advocated by proponents of the technological singularity.<ref name="modis2022">{{Cite journal |last=Modis |first=Theodore |date=2022-03-01 |title=Links between entropy, complexity, and the technological singularity |url=https://www.sciencedirect.com/science/article/pii/S0040162521008921 |journal=Technological Forecasting and Social Change |language=en |volume=176 |pages=121457 |doi=10.1016/j.techfore.2021.121457 |s2cid=245663426 |issn=0040-1625|arxiv=2410.10844 }}</ref>

AI researcher [[Jürgen Schmidhuber]] stated that the frequency of subjectively "notable events" appears to be approaching a 21st-century singularity, but cautioned readers to take such plots of subjective events with a grain of salt: perhaps differences in memory of recent and distant events could create an illusion of accelerating change where none exists.<ref>{{Citation |last=Schmidhuber |first=Jürgen |title=New millennium AI and the convergence of history |year=2006 |arxiv=cs/0606081 |bibcode=2006cs........6081S}}.</ref> Microsoft co-founder [[Paul Allen]] argued for the opposite of accelerating returns, the complexity brake:<ref name="Allen2011"/> the more progress science makes towards understanding intelligence, the more difficult it becomes to make additional progress.
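Modis's mathematical objection can be sketched as follows; this is an illustrative derivation added for clarity, not taken from the cited sources:

```latex
% Logistic (S-curve) with ceiling K, growth rate r, and midpoint t_0:
\[
  L(t) = \frac{K}{1 + e^{-r(t - t_0)}}
\]
% For t well below t_0 the exponential term dominates the denominator, so
\[
  L(t) \approx K\, e^{\,r(t - t_0)},
\]
% i.e. the early phase of a logistic curve is indistinguishable from pure
% exponential growth; the two only diverge as t approaches t_0.
% A true exponential f(t) = A e^{rt} satisfies
\[
  \frac{d}{dt}\,\ln f(t) = r \quad \text{for all } t,
\]
% so its relative growth rate is constant everywhere: an exponential has no
% "knee", and any apparent bend is an artifact of the axis scale chosen.
```

On this reading, a data series fit only in its early phase cannot distinguish a logistic from an exponential, which is the substance of the charge that a logistic was mistaken for an exponential.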
A study of the number of patents shows that human creativity does not exhibit accelerating returns but, as suggested by [[Joseph Tainter]] in his ''The Collapse of Complex Societies'',<ref name="tainter1988"/> a law of [[diminishing returns]]. The number of patents per thousand people peaked in the period from 1850 to 1900 and has been declining since.<ref name="huebner2005" /> The growth of complexity eventually becomes self-limiting and leads to a widespread "general systems collapse".

[[Douglas Hofstadter|Hofstadter]] (2006) raised the concern that Ray Kurzweil is not sufficiently scientifically rigorous, that an exponential tendency of technology is not a scientific law like one of physics, and that exponential curves have no "knees".<ref>[https://www.youtube.com/watch?v=Nhj6fDDnckE Trying to Muse Rationally About the Singularity Scenario] by Douglas Hofstadter, 2006, [https://web.archive.org/web/20170109020308/https://medium.com/@emergingtechnology/trying-to-muse-rationally-about-the-singularity-scenario-9c9db2eb9ece unauthorized transcript].</ref> Nonetheless, he did not rule out the singularity in principle in the distant future,<ref name="ieee-lumi"/> and in light of [[ChatGPT]] and other recent advances he has revised his opinion significantly towards dramatic technological change in the near future.<ref>{{Cite news |last=Brooks |first=David |date=2023-07-13 |title=Opinion {{!}} 'Human Beings Are Soon Going to Be Eclipsed' |language=en-US |work=The New York Times |url=https://www.nytimes.com/2023/07/13/opinion/ai-chatgpt-consciousness-hofstadter.html
|access-date=2023-08-02 |issn=0362-4331}}</ref>

[[Jaron Lanier]] denies that the singularity is inevitable: "I do not think the technology is creating itself. It's not an autonomous process."<ref name="lanier">{{cite web |author=Lanier |first=Jaron |date=2013 |title=Who Owns the Future? |url=http://www.epubbud.com/read.php?g=JCB8D9LA&tocp=59 |archive-url=https://web.archive.org/web/20160513131523/http://www.epubbud.com/read.php?g=JCB8D9LA&tocp=59 |archive-date=2016-05-13 |access-date=2016-03-02 |work=New York: Simon & Schuster}}</ref> He adds: "The reason to believe in human agency over technological determinism is that you can then have an economy where people earn their own way and invent their own lives. If you structure a society on ''not'' emphasizing individual human agency, it's the same thing operationally as denying people clout, dignity, and [[self-determination]] ... to embrace [the idea of the Singularity] would be a celebration of bad data and bad politics."<ref name="lanier" />

Economist [[Robert J. Gordon]] points out that measured economic growth slowed around 1970 and has slowed even further since the [[2008 financial crisis]], and argues that the economic data show no trace of a coming Singularity as imagined by mathematician [[I. J. Good]].<ref>[[William D. Nordhaus]], "Why Growth Will Fall" (a review of [[Robert J. Gordon]], ''The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War'', Princeton University Press, 2016, {{ISBN|978-0691147727}}, 762 pp., $39.95), ''[[The New York Review of Books]]'', vol. LXIII, no. 13 (August 18, 2016), p. 68.</ref>

Philosopher and cognitive scientist [[Daniel Dennett]] said in 2017: "The whole singularity stuff, that's preposterous. It distracts us from much more pressing problems", adding "AI tools that we become hyper-dependent on, that is going to happen.
And one of the dangers is that we will give them more authority than they warrant."<ref>{{Citation |last=Cadwalladr |first=Carole |title=Daniel Dennett: 'I begrudge every hour I have to spend worrying about politics' |date=12 February 2017 |work=[[The Guardian]] |url=https://www.theguardian.com/science/2017/feb/12/daniel-dennett-politics-bacteria-bach-back-dawkins-trump-interview |author-link=Carole Cadwalladr}}.</ref>

In addition to general criticisms of the singularity concept, several critics have raised issues with Kurzweil's iconic chart. One line of criticism is that a [[Log-log plot|log-log]] chart of this nature is inherently biased toward a straight-line result. Others identify selection bias in the points that Kurzweil chooses to use. For example, biologist [[PZ Myers]] points out that many of the early evolutionary "events" were picked arbitrarily.<ref name="PZMyers2009"/> Kurzweil has rebutted this by charting evolutionary events from fifteen neutral sources and showing that they fit a straight line on [[:File:ParadigmShiftsFrr15Events.svg|a log-log chart]]. [[Kevin Kelly (editor)|Kelly]] (2006) argues that because the Kurzweil chart's x-axis measures time before the present, the chart always points to the singularity being "now" for any date on which it is constructed, and shows this visually on Kurzweil's chart.<ref>{{Cite web |last=Kelly |first=Kevin |date=2006 |title=The Singularity Is Always Near |url=https://kk.org/thetechnium/the-singularity/ |access-date=2023-06-14 |website=The Technium}}</ref>

<!-- religious semblance: -->Some critics suggest religious motivations or implications of the singularity, especially Kurzweil's version of it. The buildup towards the singularity is compared with Christian end-of-time scenarios.
Alex Beam calls it "a [[Buck Rogers]] vision of the hypothetical Christian Rapture".<ref name="beam2005">{{cite news|last=Beam|first=Alex|title=That Singularity Sensation|url=http://www.boston.com/ae/books/articles/2005/02/24/that_singularity_sensation/|access-date=2013-02-15|newspaper=The Boston Globe|date=2005-02-24}}</ref> [[John Gray (philosopher)|John Gray]] says "the Singularity echoes apocalyptic myths in which history is about to be interrupted by a world-transforming event".<ref name="gray2011">{{cite magazine|last=Gray|first=John|title=On the Road to Immortality|magazine=The New York Review of Books|date=2011-11-24|url=http://www.nybooks.com/articles/archives/2011/nov/24/road-immortality/?pagination=false|access-date=2013-03-19}}</ref> [[David Streitfeld]] in ''[[The New York Times]]'' questioned whether "it might manifest first and foremost—thanks, in part, to the bottom-line obsession of today’s [[Silicon Valley]]—as a tool to slash corporate America’s head count."<ref>{{Cite web |last=Streitfeld |first=David |date=11 June 2023 |title=Silicon Valley Confronts the Idea That the 'Singularity' Is Here |url=https://www.nytimes.com/2023/06/11/technology/silicon-valley-confronts-the-idea-that-the-singularity-is-here.html |access-date=11 June 2023 |website=New York Times}}</ref> Astrophysicist and [[Philosophy of Science|scientific philosopher]] [[Adam Becker]] disputes Kurzweil's concept of uploading human minds to computers on the grounds that brains and computers are too fundamentally different to be compatible.<ref>{{Cite journal |last=Wood |first=Andrew Paul |date=May 17, 2025 |title=Mission Critical |journal=[[New Zealand Listener]] |pages=38–39}}</ref>