===Accelerating change===
{{Main|Accelerating change}}
[[File:ParadigmShiftsFrr15Events.svg|thumb|upright=2|According to Kurzweil, his [[logarithmic scale|logarithmic graph]] of 15 lists of [[paradigm shift]]s for key [[human history|historic]] events shows an [[exponential growth|exponential]] trend.]]
Some singularity proponents argue that it is inevitable by extrapolating past trends, especially those concerning the shortening gaps between improvements to technology. In one of the first uses of the term "singularity" in the context of technological progress, [[Stanislaw Ulam]] tells of a conversation with [[John von Neumann]] about accelerating change:

{{blockquote|One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.<ref name="ulam1958"/>}}

Kurzweil claims that technological progress follows a pattern of [[exponential growth]], following what he calls the "[[law of accelerating returns]]". Whenever technology approaches a barrier, Kurzweil writes, new technologies will surmount it.
He predicts that [[paradigm shift]]s will become increasingly common, leading to "technological change so rapid and profound it represents a rupture in the fabric of human history".<ref name="Kurzweil 2001">{{Citation |last=Kurzweil |first=Raymond |title=The Law of Accelerating Returns |journal=Nature Physics |volume=4 |issue=7 |page=507 |year=2001 |url=http://lifeboat.com/ex/law.of.accelerating.returns |access-date=2007-08-07 |archive-url=https://web.archive.org/web/20180827014027/https://lifeboat.com/ex/law.of.accelerating.returns |archive-date=2018-08-27 |url-status=live |publisher=Lifeboat Foundation |bibcode=2008NatPh...4..507B |doi=10.1038/nphys1010 |author-link=Raymond Kurzweil |doi-access=free}}.</ref> Kurzweil believes that the singularity will occur by approximately 2045.<ref name="kurzweil2005"/> His predictions differ from Vinge's in that he foresees a gradual ascent to the singularity, rather than a rapidly self-improving superhuman intelligence.

Oft-cited dangers include those commonly associated with [[molecular nanotechnology]] and [[genetic engineering]]. These threats are major issues for both singularity advocates and critics, and were the subject of [[Bill Joy]]'s April 2000 ''[[Wired (magazine)|Wired]]'' magazine article "[[Why The Future Doesn't Need Us]]".<ref name="chalmers2010" /><ref name="Joy2000"/>