=== Existential risk ===
{{main|Global catastrophic risk}}
Existential risk researchers analyze risks that could lead to [[human extinction]] or civilizational collapse, and look for ways to build resilience against them.<ref>{{Cite web |title=About us |url=https://www.cser.ac.uk/about-us/ |access-date=11 September 2022 |website=cser.ac.uk |archive-date=30 December 2017 |archive-url=https://web.archive.org/web/20171230172611/https://www.cser.ac.uk/about-us/ |url-status=live }}</ref><ref name=":0">{{Cite journal |last=Gottlieb |first=J. |date=1 May 2022 |title=Discounting, Buck-Passing, and Existential Risk Mitigation: The Case of Space Colonization |url=https://www.sciencedirect.com/science/article/pii/S0265964622000121 |journal=Space Policy |volume=60 |page=101486 |bibcode=2022SpPol..6001486G |doi=10.1016/j.spacepol.2022.101486 |issn=0265-9646 |s2cid=247718992 |url-access=subscription }}</ref> Relevant research centers include the [[Centre for the Study of Existential Risk|Cambridge Centre for the Study of Existential Risk]] and the Stanford Existential Risk Initiative.<ref>{{Cite web |title=Stanford Existential Risks Initiative |url=https://cisac.fsi.stanford.edu/stanford-existential-risks-initiative/content/stanford-existential-risks-initiative |access-date=4 October 2022 |website=cisac.fsi.stanford.edu |archive-date=22 September 2022 |archive-url=https://web.archive.org/web/20220922150116/https://cisac.fsi.stanford.edu/stanford-existential-risks-initiative/content/stanford-existential-risks-initiative |url-status=live }}</ref> Future technologies may contribute to the risks of [[artificial general intelligence]], [[biological warfare]], [[nuclear warfare]], [[nanotechnology]], [[anthropogenic climate change]], [[global warming]], or stable global [[totalitarianism]], though technologies may also help mitigate [[Impact event|asteroid impacts]] and [[gamma-ray burst]]s.<ref>{{Cite book |last1=Bostrom |first1=Nick |url=https://books.google.com/books?id=sTkfAQAAQBAJ |title=Global Catastrophic Risks |last2=Cirkovic |first2=Milan M. |year=2011 |publisher=OUP Oxford |isbn=978-0199606504 |access-date=11 September 2022 |archive-date=4 October 2022 |archive-url=https://web.archive.org/web/20221004185315/https://books.google.com/books?id=sTkfAQAAQBAJ |url-status=live }}</ref> In 2019, philosopher [[Nick Bostrom]] introduced the notion of a ''vulnerable world'', "one in which there is some level of technological development at which civilization almost certainly gets devastated by default", citing the risks of a [[pandemic]] caused by [[Bioterrorism|bioterrorists]] or an [[arms race]] triggered by the development of novel armaments and the loss of [[mutual assured destruction]].<ref name="Bostrom 2019">{{Cite journal |last=Bostrom |first=Nick |date=6 September 2019 |title=The Vulnerable World Hypothesis |journal=Global Policy |volume=10 |issue=4 |pages=455–476 |doi=10.1111/1758-5899.12718 |issn=1758-5880 |s2cid=203169705 |doi-access=free }}</ref> He invites policymakers to question the assumptions that technological progress is always beneficial, that scientific openness is always preferable, and that they can afford to wait until a dangerous technology has been invented before preparing mitigations.<ref name="Bostrom 2019" />