{{Short description|Hypothetical end of the human species}} {{Redirect|Omnicide||Omnicide (disambiguation)}} {{hatnote|For methodological challenges quantifying and mitigating the risk, proposed mitigation measures, and related organizations, see [[Global catastrophic risk]].}} {{Use American English|date=August 2021}} {{Use mdy dates|date=October 2021}} [[File:The explosion of the hydrogen bomb Ivy Mike.jpg|thumb|[[Nuclear war]] is an often-predicted cause of the extinction of humankind.<ref>{{Cite web |last=Di Mardi |date=October 15, 2020 |title=The grim fate that could be 'worse than extinction' |url=https://www.bbc.com/future/article/20201014-totalitarian-world-in-chains-artificial-intelligence |access-date=November 11, 2020 |website=BBC News |quote=When we think of existential risks, events like nuclear war or asteroid impacts often come to mind.}}</ref>]] '''Human extinction''' or '''omnicide''' is the hypothetical [[Extinction|end]] of the [[Human|human species]], either by [[population decline]] due to extraneous natural causes, such as an [[impact event|asteroid impact]] or [[Supervolcano|large-scale volcanism]], or via [[anthropogenic hazard|anthropogenic destruction]] (self-extinction). Some of the [[Global catastrophic risk|many possible contributors]] to anthropogenic hazard are [[Effects of global warming on humans|climate change]], [[Nuclear holocaust|global nuclear annihilation]], [[biological warfare]], [[weapon of mass destruction|weapons of mass destruction]], and [[ecological collapse]]. Other scenarios center on emerging technologies, such as [[Existential risk from artificial general intelligence|advanced artificial intelligence]], [[biotechnology]], or [[Gray goo|self-replicating nanobots]]. The scientific consensus is that there is a relatively low risk of near-term human extinction due to natural causes.<ref name=":13">{{Cite journal |last1=Snyder-Beattie |first1=Andrew E. |last2=Ord |first2=Toby |last3=Bonsall |first3=Michael B. |date=July 30, 2019 |title=An upper bound for the background rate of human extinction |journal=Scientific Reports |volume=9 |issue=1 |pages=11054 |bibcode=2019NatSR...911054S |doi=10.1038/s41598-019-47540-7 |issn=2045-2322 |pmc=6667434 |pmid=31363134}}</ref>{{sfn|Bostrom|2013}} The likelihood of human extinction through humankind's own activities, however, is a current area of research and debate. == History of thought == === Early history === Before the 18th and 19th centuries, the possibility that humans or other organisms could become extinct was viewed with scepticism.{{r|Moynihan}} It contradicted the [[principle of plenitude]], a doctrine that all possible things exist.{{r|Moynihan}} The principle traces back to [[Aristotle]], and was an important tenet of Christian theology.<ref name="Darwin">{{Cite book |last1=Darwin |first1=Charles |title=The Annotated Origin |last2=Costa |first2=James T. |date=2009 |publisher=Harvard University Press |isbn=978-0674032811 |page=121}}</ref> Ancient philosophers such as [[Plato]], Aristotle, and [[Lucretius]] wrote of the end of humankind only as part of a cycle of renewal. [[Marcion of Sinope]] was a proto-protestant who advocated for [[antinatalism]] that could lead to human extinction.<ref name="Moll 2010 p. 132">{{cite book | last=Moll | first=S. 
| title=The Arch-heretic Marcion | publisher=Mohr Siebeck | series=Wissenschaftliche Untersuchungen zum Neuen Testament | year=2010 | isbn=978-3-16-150268-2 | url=https://books.google.com/books?id=P3DGtdAYB9oC&pg=PA132 | access-date=2023-06-11 | page=132}}</ref><ref name="Welchman 2014 p. 21">{{cite book | last=Welchman | first=A. | title=Politics of Religion/Religions of Politics | publisher=Springer Netherlands | series=Sophia Studies in Cross-cultural Philosophy of Traditions and Cultures | year=2014 | isbn=978-94-017-9448-0 | url=https://books.google.com/books?id=-8zlBAAAQBAJ&pg=PA21 | access-date=2023-06-11 | page=21}}</ref> Later philosophers such as [[Al-Ghazali]], [[William of Ockham]], and [[Gerolamo Cardano]] expanded the study of [[logic]] and [[probability]] and began wondering if abstract worlds existed, including a world without humans. Physicist [[Edmond Halley]] stated that the extinction of the human race may be beneficial to the future of the world.<ref name="Moynihan 2020 p. 56">{{cite book | last=Moynihan | first=T. | title=X-Risk: How Humanity Discovered Its Own Extinction | publisher=MIT Press | year=2020 | isbn=978-1-913029-84-5 | url=https://books.google.com/books?id=7oUBEAAAQBAJ&pg=PA56 | access-date=2022-10-19 | page=56}}</ref> The notion that species can become extinct gained scientific acceptance during the [[Age of Enlightenment]] in the 17th and 18th centuries, and by 1800 [[Georges Cuvier]] had identified 23 extinct prehistoric species.{{r|Moynihan}} The doctrine was further gradually undermined by evidence from the natural sciences, particularly the discovery of fossil evidence of species that appeared to no longer exist, and the development of theories of evolution.<ref name="Darwin"/> In ''[[On the Origin of Species]]'', [[Charles Darwin]] discussed the extinction of species as a natural process and a core component of natural selection.<ref name="Raup">{{Cite book |last=Raup |first=David M. |title=Tempo And Mode in Evolution: Genetics And Paleontology 50 Years After Simpson |publisher=National Academies Press (US) |year=1995 |editor-last=Fitch |editor-first=W. M. |chapter=The Role of Extinction in Evolution |editor-last2=Ayala |editor-first2=F. J. |chapter-url=https://www.ncbi.nlm.nih.gov/books/NBK232212/}}</ref> Notably, Darwin was skeptical of the possibility of sudden extinction, viewing it as a gradual process. He held that the abrupt disappearances of species from the fossil record were not evidence of catastrophic extinctions, but rather represented unrecognised gaps{{clarify|date=October 2023}} in the record.<ref name="Raup" /> As the possibility of extinction became more widely established in the sciences, so did the prospect of human extinction.{{r|Moynihan}} In the 19th century, human extinction became a popular topic in science (e.g., [[Thomas Robert Malthus]]'s ''[[An Essay on the Principle of Population]]'') and fiction (e.g., [[Jean-Baptiste Cousin de Grainville]]'s ''[[Le Dernier Homme|The Last Man]]''). In 1863, a few years after Darwin published ''On the Origin of Species'', [[William King (geologist)|William King]] proposed that [[Neanderthal]]s were an extinct species of the genus ''[[Homo]]''. 
The [[Romanticism|Romantic]] authors and poets were particularly interested in the topic.{{r|Moynihan}} [[Lord Byron]] wrote about the extinction of life on Earth in his 1816 poem "[[Darkness (poem)|Darkness]]", and in 1824 envisaged humanity being threatened by a comet impact, and employing a missile system to defend against it.{{r|Moynihan}} [[Mary Shelley]]'s 1826 novel ''[[The Last Man (Mary Shelley novel)|The Last Man]]'' is set in a world where humanity has been nearly destroyed by a mysterious plague.<ref name="Moynihan" /> At the turn of the 20th century, [[Russian cosmism]], a precursor to modern [[transhumanism]], advocated avoiding humanity's extinction by colonizing space.{{r|Moynihan}} === Atomic era === [[File:Operation Castle - Romeo 001.jpg|thumb|upright|''[[Castle Romeo]]'' nuclear test on [[Bikini Atoll]]]] The invention of the atomic bomb prompted a wave of discussion among scientists, intellectuals, and the public at large about the risk of human extinction.{{r|Moynihan}} In a 1945 essay, [[Bertrand Russell]] wrote: <blockquote>The prospect for the human race is sombre beyond all precedent. Mankind are faced with a clear-cut alternative: either we shall all perish, or we shall have to acquire some slight degree of common sense.<ref>{{Cite web |last=Russell |first=Bertrand |date=1945 |title=The Bomb and Civilization |url=http://www.personal.kent.edu/~rmuhamma/Philosophy/RBwritings/bombCivilization.htm |archive-url=https://web.archive.org/web/20200807144106/http://www.personal.kent.edu/~rmuhamma/Philosophy/RBwritings/bombCivilization.htm |archive-date=August 7, 2020}}</ref> </blockquote>In 1950, [[Leo Szilard]] suggested it was technologically feasible to build a [[cobalt bomb]] that could render the planet unlivable. A 1950 Gallup poll found that 19% of Americans believed that another world war would mean "an end to mankind".<ref name="Erskine">{{Cite journal |last=Erskine |first=Hazel Gaudet |date=1963 |title=The Polls: Atomic Weapons and Nuclear Energy |url=https://www.jstor.org/stable/2746913 |journal=The Public Opinion Quarterly |volume=27 |pages=155β190 |doi=10.1086/267159 |jstor=2746913 |number=2|url-access=subscription }}</ref> [[Rachel Carson]]'s 1962 book ''[[Silent Spring]]'' raised awareness of environmental catastrophe. In 1983, [[Brandon Carter]] proposed the [[Doomsday argument]], which used [[Bayesian probability]] to predict the total number of humans that will ever exist. The discovery of "[[nuclear winter]]" in the early 1980s, a specific mechanism by which nuclear war could result in human extinction, again raised the issue to prominence. Writing about these findings in 1983, [[Carl Sagan]] argued that measuring the severity of extinction solely in terms of those who die "conceals its full impact", and that nuclear war "imperils all of our descendants, for as long as there will be humans."<ref>{{Cite news |last=Sagan |first=Carl |date=January 28, 2009 |title=Nuclear War and Climatic Catastrophe: Some Policy Implications |url=https://www.foreignaffairs.com/articles/1983-12-01/nuclear-war-and-climatic-catastrophe-some-policy-implications |access-date=August 11, 2021 |doi=10.2307/20041818 |jstor=20041818}}</ref> === Post-Cold War === [[John A. Leslie|John Leslie]]'s 1996 book ''The End of The World'' was an academic treatment of the science and ethics of human extinction. In it, Leslie considered a range of threats to humanity and what they have in common. 
In 2003, British [[Astronomer Royal]] Sir [[Martin Rees]] published ''[[Our Final Hour]]'', in which he argues that advances in certain technologies create new threats to the survival of humankind and that the 21st century may be a critical moment in history when humanity's fate is decided.<ref name=":12">{{Cite book |last=Rees |first=Martin |title=Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future In This Century – On Earth and Beyond |publisher=[[Basic Books]] |year=2003 |isbn=0-465-06863-4 |language=en}}</ref> ''[[Global Catastrophic Risks (book)|Global Catastrophic Risks]]'', edited by [[Nick Bostrom]] and [[Milan M. Ćirković]] and published in 2008, is a collection of essays from 26 academics on various global catastrophic and existential risks.<ref>{{Cite book |title=Global catastrophic risks |publisher=[[Oxford University Press]] |year=2008 |isbn=978-0199606504 |editor-last=Bostrom |editor-first=Nick |editor-link=Nick Bostrom |editor-last2=Ćirković |editor-first2=Milan M. |editor-link2=Milan M. Ćirković}}</ref> [[Toby Ord]]'s 2020 book ''[[The Precipice: Existential Risk and the Future of Humanity]]'' argues that preventing existential risks is one of the most important moral issues of our time. The book discusses, quantifies, and compares different existential risks, concluding that the greatest risks are presented by unaligned artificial intelligence and biotechnology.<ref name="Ord 2020q">{{Cite book |last=Ord |first=Toby |title=The Precipice: Existential Risk and the Future of Humanity |date=2020 |publisher=Hachette |isbn=9780316484916 |location=New York |at=4:15–31 |language=en-us |quote=This is an equivalent, though crisper statement of [[Nick Bostrom]]'s definition: "An existential risk is one that threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development." Source: Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority". Global Policy.}}</ref> == Causes == {{main|Global catastrophe scenarios}} Potential anthropogenic causes of human extinction include [[Nuclear warfare|global thermonuclear war]], deployment of a highly effective [[Biological agent|biological weapon]], ecological collapse, [[Existential risk from artificial general intelligence|runaway artificial intelligence]], runaway [[nanotechnology]] (such as a [[Gray goo|grey goo]] scenario), [[Human overpopulation|overpopulation]] and [[overconsumption|increased consumption]] causing resource depletion and a concomitant population crash, [[population decline]] by choosing to have fewer children, and displacement of naturally evolved humans by a new species produced by [[genetic engineering]] or technological augmentation. Natural and external extinction risks include a high-fatality-rate [[pandemic]], a [[Supervolcano|supervolcanic]] [[Types of volcanic eruptions|eruption]], an [[Impact event|asteroid impact]], a nearby [[supernova]] or [[gamma-ray burst]], and an extreme [[solar flare]]. Humans (''[[Homo sapiens sapiens]]'') as a species may also be considered to have "gone extinct" simply by being replaced with distant descendants whose continued [[evolution]] may produce new species or subspecies of ''[[Homo]]'' or of [[hominid]]s. Without intervention by unexpected forces, the [[stellar evolution]] of the Sun [[Future of Earth#Solar evolution|is expected]] to make Earth uninhabitable, then destroy it. 
Depending on [[ultimate fate of the universe|its ultimate fate]], the entire universe may eventually become uninhabitable. == Probability == === Natural vs. anthropogenic === Experts generally agree that anthropogenic existential risks are (much) more likely than natural risks.<ref name="Ord 20202" /><ref name=":12"/><ref name="fhi2">{{Cite journal |last1=Bostrom |first1=Nick |author-link=Nick Bostrom |last2=Sandberg |first2=Anders |author-link2=Anders Sandberg |date=2008 |title=Global Catastrophic Risks Survey |url=https://www.fhi.ox.ac.uk/reports/2008-1.pdf |journal=FHI Technical Report #2008-1 |publisher=[[Future of Humanity Institute]]}}</ref><ref name=":13"/><ref name="Frequently Asked Questions">{{Cite web |title=Frequently Asked Questions |url=http://existential-risk.org/faq.html |access-date=July 26, 2013 |website=Existential Risk |publisher=[[Future of Humanity Institute]] |quote="The great bulk of existential risk in the foreseeable future is anthropogenic; that is, arising from human activity."}}</ref> A key difference between these risk types is that empirical evidence can place an upper bound on the level of natural risk.<ref name=":13" /> Humanity has existed for at least 200,000 years, over which it has been subject to a roughly constant level of natural risk. If the natural risk were sufficiently high, then it would be highly unlikely that humanity would have survived as long as it has. Based on a formalization of this argument, researchers have concluded that we can be confident that natural risk is lower than 1 in 14,000 per year (equivalent to 1 in 140 per century, on average).<ref name=":13" /> Another empirical method to study the likelihood of certain natural risks is to investigate the geological record.<ref name="Ord 20202" /> For example, a [[impact event|comet or asteroid impact event]] sufficient in scale to cause an [[impact winter]] that would cause human extinction before the year 2100 has been estimated at one-in-a-million.<ref name="matheny">{{Cite journal |last=Matheny |first=Jason Gaverick |date=2007 |title=Reducing the Risk of Human Extinction |journal=Risk Analysis |volume=27 |issue=5 |pages=1335β1344 |doi=10.1111/j.1539-6924.2007.00960.x |pmid=18076500 |bibcode=2007RiskA..27.1335M |s2cid=14265396 |url=http://users.physics.harvard.edu/~wilson/pmpmta/Mahoney_extinction.pdf |access-date=July 1, 2016 |archive-date=August 27, 2014 |archive-url=https://web.archive.org/web/20140827213919/http://users.physics.harvard.edu/~wilson/pmpmta/Mahoney_extinction.pdf |url-status=dead }}</ref><ref>{{Cite journal |last1=Asher |first1=D.J. |last2=Bailey |first2=M.E. |last3=Emel'yanenko |first3=V. |last4=Napier |first4=W.M. |year=2005 |title=Earth in the cosmic shooting gallery |url=http://www.arm.ac.uk/preprints/455.pdf |journal=The Observatory |volume=125 |pages=319β322 |bibcode=2005Obs...125..319A}}</ref> Moreover, large [[supervolcano]] eruptions may cause a [[volcanic winter]] that could endanger the survival of humanity.<ref name=":11">{{Cite journal |last1=Rampino |first1=M.R. |last2=Ambrose |first2=S.H. 
|year=2002 |title=Super eruptions as a threat to civilizations on Earth-like planets |url=http://www.eos.ubc.ca/~mjelline/website212/rampino02.pdf |journal=Icarus |volume=156 |issue=2 |pages=562–569 |bibcode=2002Icar..156..562R |doi=10.1006/icar.2001.6808 |access-date=February 14, 2022 |archive-date=September 24, 2015 |archive-url=https://web.archive.org/web/20150924001405/http://www.eos.ubc.ca/~mjelline/website212/rampino02.pdf |url-status=dead }}</ref> Based on the geological record, supervolcanic eruptions are estimated to occur on average about once every 50,000 years, though most such eruptions would not reach the scale required to cause human extinction.<ref name=":11" /> The supervolcano [[Toba catastrophe theory|Toba may have almost wiped out humanity]] at the time of its last eruption, though this is contentious.<ref name=":11" /><ref>{{Cite journal |last1=Yost |first1=Chad L. |last2=Jackson |first2=Lily J. |last3=Stone |first3=Jeffery R. |last4=Cohen |first4=Andrew S. |date=March 1, 2018 |title=Subdecadal phytolith and charcoal records from Lake Malawi, East Africa imply minimal effects on human evolution from the ~74 ka Toba supereruption |journal=Journal of Human Evolution |volume=116 |pages=75–94 |doi=10.1016/j.jhevol.2017.11.005 |issn=0047-2484 |pmid=29477183|doi-access=free |bibcode=2018JHumE.116...75Y }}</ref> Since anthropogenic risk is a relatively recent phenomenon, humanity's track record of survival cannot provide similar assurances.<ref name=":13" /> Humanity has survived for only about 80 years since the creation of nuclear weapons, and for future technologies there is no track record at all. This has led thinkers like [[Carl Sagan]] to conclude that humanity is currently in a "time of perils"<ref>{{Cite book |last=Sagan |first=Carl |title=[[Pale Blue Dot (book)|Pale Blue Dot]] |date=1994 |publisher=Random House |isbn=0-679-43841-6 |pages=305–306 |quote="Some planetary civilizations see their way through, place limits on what may and what must not be done, and safely pass through the time of perils. Others are not so lucky or so prudent, perish." |author-link=Carl Sagan}}</ref> – a uniquely dangerous period in human history, in which it is subject to unprecedented levels of risk, beginning when humans first became capable of posing risks to themselves through their own actions.<ref name="Ord 20202" /><ref>{{Cite book |last=Parfit |first=Derek |title=On What Matters Vol. 2 |date=2011 |publisher=Oxford University Press |isbn=9780199681044 |page=616 |quote="We live during the hinge of history ... If we act wisely in the next few centuries, humanity will survive its most dangerous and decisive period." |author-link=Derek Parfit}}</ref> Paleobiologist [[Olev Vinn]] has suggested that humans likely have a number of inherited behavior patterns (IBPs) that are not fine-tuned for the conditions prevailing in technological civilization. Some IBPs may be highly incompatible with such conditions and have a high potential to induce self-destruction. These patterns may include responses of individuals seeking power over conspecifics in relation to harvesting and consuming energy.<ref name=vinn2024>{{cite journal|last=Vinn|first=O.|date=2024|title=Potential incompatibility of inherited behavior patterns with civilization: Implications for Fermi paradox|journal=Science Progress|volume=107|issue=3|pages=1–6|doi=10.1177/00368504241272491|pmid= 39105260|s2cid= |pmc=11307330}}</ref> Vinn has also proposed ways of addressing the problem of inherited behavior patterns.<ref name=vinn2025>{{cite journal|last=Vinn|first=O.|date=2025|title=How to solve the problem of inherited behavior patterns and increase the sustainability of technological civilization|journal=Frontiers in Psychology|volume=16|issue= |pages=1–4|doi=10.3389/fpsyg.2025.1562943|doi-access=free |pmid= 40018008|s2cid= |pmc=11866485 }}</ref> === Risk estimates === Given the limitations of ordinary observation and modeling, [[Delphi method|expert elicitation]] is frequently used instead to obtain probability estimates.<ref name="probabilities and methodologies">{{Cite journal |last1=Rowe |first1=Thomas |last2=Beard |first2=Simon |date=2018 |title=Probabilities, methodologies and the evidence base in existential risk assessments |url=http://eprints.lse.ac.uk/89506/1/Beard_Existential-Risk-Assessments_Accepted.pdf |journal=Working Paper, Centre for the Study of Existential Risk |access-date=August 26, 2018}}</ref> * Humanity has a 95% probability of being extinct within 7,800,000 years, according to [[J. Richard Gott]]'s formulation of the controversial [[doomsday argument]], which argues that we have probably already lived through half the duration of human history.<ref>{{Cite journal |author=Gott, III |first=J. Richard |year=1993 |title=Implications of the Copernican principle for our future prospects |journal=[[Nature (journal)|Nature]] |volume=363 |issue=6427 |pages=315–319 |bibcode=1993Natur.363..315G |doi=10.1038/363315a0 |s2cid=4252750}}</ref> * In 1996, [[John A. Leslie]] estimated a 30% risk over the next five centuries (equivalent to around 6% per century, on average).{{sfn|Leslie|1996|p=146}} * In 2003, [[Martin Rees]] estimated a 50% chance of collapse of civilization in the twenty-first century.<ref>{{Cite book |last=Rees |first=Martin |title=Our Final Century |publisher=Arrow Books |year=2004 |page=9 |orig-year=2003}}</ref> * The [[Global Challenges Foundation]]'s 2016 annual report estimates an annual probability of human extinction of at least 0.05% (equivalent to 5% per century, on average).<ref name="gcf_atlantic">{{Cite web |last=Meyer |first=Robinson |date=April 29, 2016 |title=Human Extinction Isn't That Unlikely |url=https://www.theatlantic.com/technology/archive/2016/04/a-human-extinction-isnt-that-unlikely/480444/ |access-date=April 30, 2016 |website=[[The Atlantic]] |publisher=Emerson Collective |location=Boston, Massachusetts}}</ref> * As of May 28, 2025, [[Metaculus]] users estimate a 0.2% probability of human extinction by 2100.<ref>{{Cite web |date=November 12, 2017 |title=Will humans become extinct by 2100? |url=https://www.metaculus.com/questions/578/human-extinction-by-2100 |access-date=May 28, 2025 |website=[[Metaculus]]}}</ref> * According to a 2020 study published in ''[[Scientific Reports]]'', if [[deforestation]] and resource [[Overconsumption|consumption]] continue at current rates, they could culminate in a "catastrophic collapse in human population" and possibly "an irreversible collapse of our civilization" in the next 20 to 40 years. 
According to the most optimistic scenario provided by the study, the chances that human civilization survives are smaller than 10%. To avoid this collapse, the study says, humanity should pass from a civilization dominated by the economy to a "cultural society" that "privileges the interest of the ecosystem above the individual interest of its components, but eventually in accordance with the overall communal interest."<ref>{{Cite web |last=Nafeez |first=Ahmed |title=Theoretical Physicists Say 90% Chance of Societal Collapse Within Several Decades |url=https://www.vice.com/en/article/theoretical-physicists-say-90-chance-of-societal-collapse-within-several-decades/ |access-date=August 2, 2021 |website=Vice|date=July 28, 2020 }}</ref><ref>{{Cite journal |last1=Bologna |first1=M. |last2=Aquino |first2=G. |date=2020 |title=Deforestation and world population sustainability: a quantitative analysis |journal=Scientific Reports |volume=10 |issue=7631 |page=7631 |doi=10.1038/s41598-020-63657-6 |pmc=7203172 |pmid=32376879|arxiv=2006.12202 |bibcode=2020NatSR..10.7631B }}</ref> * Nick Bostrom, a philosopher at the [[University of Oxford]] known for his work on [[existential risk]], argues ** that it would be "misguided"<ref name="Bostrom 2002">{{Citation |last=Bostrom |first=Nick |title=Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards |date=2002 |work=Journal of Evolution and Technology |volume=9 |quote=My subjective opinion is that setting this probability lower than 25% would be misguided, and the best estimate may be considerably higher.}}</ref> to assume that the probability of near-term extinction is less than 25%, and ** that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure.{{sfn|Bostrom|2013}}<ref name="matheny"/> * Philosopher [[John A. Leslie]] assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical [[doomsday argument]] that Leslie champions. Leslie's argument is somewhat [[frequentist]], based on the observation that human extinction has never been observed, but requires subjective anthropic arguments.<ref>{{Cite journal |last=Whitmire |first=Daniel P. |date=August 3, 2017 |title=Implication of our technological species being first and early |journal=International Journal of Astrobiology |volume=18 |issue=2 |pages=183β188 |doi=10.1017/S1473550417000271 |doi-access=free}}</ref> Leslie also discusses the anthropic [[survivorship bias]] (which he calls an "observational selection" effect on page 139) and states that the ''[[A priori and a posteriori|a priori]]'' certainty of observing an "undisastrous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. 
He quotes [[Holger Bech Nielsen]]'s formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused the eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe."{{sfn|Leslie|1996|p=139}} * Jean-Marc Salotti calculated the probability of human extinction caused by a giant asteroid impact.<ref name="Salotti">{{cite journal |last1=Salotti |first1=Jean-Marc |title=Human extinction by asteroid impact |journal=Futures |date=April 2022 |volume=138 |page=102933 |doi=10.1016/j.futures.2022.102933 |s2cid=247718308 |doi-access=free }}</ref> He estimated it at between 0.03 and 0.3 over the next billion years if there is no colonization of other planets. According to that study, the most dangerous threat is a giant long-period comet, which would give a warning time of only a few years and therefore no time for any intervention in space or settlement on the Moon or Mars. The probability of a giant comet impact in the next hundred years is {{val|2.2E-12}}.<ref name="Salotti" /> * In 2023, the United Nations Office for Disaster Risk Reduction estimated a 2 to 14% (median: 8%){{Fix|text=Unclear what the median represents in this context. Over what distribution is it calculated?}} chance of an extinction-level event by 2100 and a 14 to 98% (median: 56%) chance of an extinction-level event by 2700.<ref>{{Cite web |last=Klaas |first=Brian |date=March 12, 2025 |title=DOGE Is Courting Catastrophic Risk |url=https://www.theatlantic.com/politics/archive/2025/03/doge-musk-catastrophic-risk/682011/ |access-date=March 14, 2025 |website=[[The Atlantic]]}}</ref>{{clarification needed |date=March 2025 |reason="in" or "by"?}} * On January 27, 2025, Bill Gates told ''[[The Wall Street Journal]]'' that he believes there is a 10–15% chance of a natural pandemic occurring in the next four years.<ref>{{Cite web |last=King |first=Jordan |date=March 11, 2025 |title=Americans Are Worried About Another Pandemic |url=https://www.newsweek.com/pandemic-covid-19-poll-survey-2042274 |access-date=April 14, 2025 |website=[[Newsweek]]}}</ref> * In ''The Decline and Fall of the Human Empire'' (2025), [[Henry Gee]] predicted that humanity will go extinct within the next 10,000 years and argued that, to avoid this, humanity should establish space colonies within the next 200–300 years.<ref>{{Cite book |last=Gee |first=Henry |title=The Decline and Fall of the Human Empire |publisher=[[Macmillan Publishers]] |year=2025}}</ref> ====From nuclear weapons==== {{Main|Nuclear holocaust#Likelihood of complete human extinction}} On November 13, 2024, the [[American Enterprise Institute]] estimated the probability of nuclear war during the 21st century at between 0% and 80% (midpoint: 40%).<ref name="AEI">{{Cite web |last=Pielke, Jr. |first=Roger |date=November 13, 2024 |title=Global Existential Risks |url=https://www.aei.org/articles/global-existential-risks/ |access-date=December 17, 2024 |website=[[American Enterprise Institute]]}}</ref> A 2023 article in ''[[The Economist]]'' estimated an 8% chance of nuclear war causing a global catastrophe and a 0.5625% chance of nuclear war causing human extinction.<ref name="econ">{{Cite news |date=July 10, 2023 |title=What are the chances of an AI apocalypse? |url=https://www.economist.com/science-and-technology/2023/07/10/what-are-the-chances-of-an-ai-apocalypse |access-date=July 10, 2023 |newspaper=[[The Economist]]}}</ref> ====From supervolcanic eruption==== On November 13, 2024, the [[American Enterprise Institute]] estimated an annual probability of a supervolcanic eruption of around 0.0067% (0.67% per century, on average).<ref name="AEI" /> ====From artificial intelligence==== {{Main|Existential risk from artificial intelligence}} * A 2008 survey by the Future of Humanity Institute estimated a 5% probability of extinction by superintelligence by 2100.<ref name="fhi2"/> * A 2016 survey of AI experts found a median estimate of 5% that human-level AI would cause an outcome that was "extremely bad (e.g. human extinction)".<ref>{{Cite arXiv |eprint=1705.08807 |class=cs.AI |first1=Katja |last1=Grace |first2=John |last2=Salvatier |title=When Will AI Exceed Human Performance? Evidence from AI Experts |date=May 3, 2018 |first3=Allen |last3=Dafoe |first4=Baobao |last4=Zhang |first5=Owain |last5=Evans}}</ref> In 2019, the estimate was lowered to 2%, but in 2022 it was raised back to 5%; it rose to 10% in 2023 and to 15% in 2024.<ref>{{Cite web |last=Strick |first=Katie |title=Is the AI apocalypse actually coming? What life could look like if robots take over |url=https://www.standard.co.uk/lifestyle/ai-apocalypse-life-robots-take-over-elon-musk-chatgpt-b1078423.html#:~:text=According%20to%20a%20recent%20study,religions%20or%20even%20playing%20God.|magazine=The Standard |date=May 31, 2023 |access-date=May 31, 2023}}</ref> * In 2020, [[Toby Ord]] estimated existential risk in the next century at "1 in 6" in his book ''[[The Precipice: Existential Risk and the Future of Humanity]]''.<ref name="Ord 20202" /><ref>{{Cite magazine |last=Purtill |first=Corinne |title=How Close Is Humanity to the Edge? |url=https://www.newyorker.com/culture/annals-of-inquiry/how-close-is-humanity-to-the-edge |magazine=The New Yorker |access-date=January 8, 2021}}</ref> He also estimated a "1 in 10" risk of extinction by unaligned AI within the next century. * According to a July 10, 2023 article in ''[[The Economist]]'', scientists estimated a 12% chance of AI-caused catastrophe and a 3% chance of AI-caused extinction by 2100.<ref name="econ"/> * On May 1, 2023, the Treaty on Artificial Intelligence Safety and Cooperation (TAISC) estimated a 32.2% risk of an AI-caused catastrophe by 2200. If a six-month moratorium is implemented before 2026, these odds drop to 9.86%.<ref>{{Cite web |date=May 1, 2023 |title=A 30% Chance of AI Catastrophe: Samotsvety's Forecasts on AI Risks and the Impact of a Strong AI Treaty |url=https://taisc.org/report |access-date=May 1, 2023 |website=Treaty on Artificial Intelligence Safety and Cooperation (TAISC)}}</ref> * On December 27, 2024, [[Geoffrey Hinton]] estimated a 10–20% (midpoint: 15%) probability of AI-caused extinction in the next 30 years.<ref>{{Cite web |last=Milmo |first=Dan |date=December 27, 2024 |title='Godfather of AI' shortens odds of the technology wiping out humanity over next 30 years |url=https://www.theguardian.com/technology/2024/dec/27/godfather-of-ai-raises-odds-of-the-technology-wiping-out-humanity-over-next-30-years |access-date=December 27, 2024 |website=[[The Guardian]]}}</ref> He also estimated a 50–100% (midpoint: 75%) probability of AI-caused extinction in the next 150 years. * On May 6, 2025, ''[[Scientific American]]'' reported an estimated 0–10% (midpoint: 5%) probability of AI-caused extinction by 2100.<ref>{{Cite web |last=Vermeer |first=Michael |date=May 6, 2025 |title=Could AI Really Kill Off Humans? |url=https://www.scientificamerican.com/article/could-ai-really-kill-off-humans/ |access-date=May 7, 2025 |website=[[Scientific American]]}}</ref> ====From climate change==== {{Main|Climate change and civilizational collapse}} [[File:Placard against human extinction, Extinction Rebellion (cropped).jpg|thumb|Placard against omnicide, at [[Extinction Rebellion]] (2018)]] In a 2010 interview with ''[[The Australian]]'', the late Australian scientist [[Frank Fenner]] predicted the extinction of the human race within a century, primarily as the result of [[human overpopulation]], [[environmental degradation]], and climate change.<ref>{{Cite web |last=Edwards |first=Lin |date=June 23, 2010 |title=Humans will be extinct in 100 years says eminent scientist |url=https://phys.org/news/2010-06-humans-extinct-years-eminent-scientist.html |access-date=January 10, 2021 |website=[[Phys.org]]}}</ref> Several economists have discussed the importance of global catastrophic risks. For example, [[Martin Weitzman]] argues that most of the expected economic damage from [[climate change]] may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage.<ref name="weitzman1">{{Cite journal |last=Weitzman |first=Martin |date=2009 |title=On modeling and interpreting the economics of catastrophic climate change |url=http://dash.harvard.edu/bitstream/handle/1/3693423/Weitzman_OnModeling.pdf |journal=The Review of Economics and Statistics |volume=91 |issue=1 |pages=1–19 |doi=10.1162/rest.91.1.1 |s2cid=216093786}}</ref> [[Richard Posner]] has argued that humanity is doing far too little, in general, about small, hard-to-estimate risks of large-scale catastrophes.<ref>{{Cite book |last=Posner |first=Richard |title=Catastrophe: Risk and Response |title-link=Catastrophe: Risk and Response |date=2004 |publisher=Oxford University Press}}</ref> ===Individual vs. species risks=== Although existential risks are less manageable by individuals than, for example, health risks, according to Ken Olum, [[Joshua Knobe]], and Alexander Vilenkin, the possibility of human extinction ''does'' have practical implications. 
For instance, if the "universal" [[doomsday argument]] is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "...you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."<ref>"Practical application", of the [[Princeton University]] paper: [http://www.princeton.edu/~jknobe/physics.pdf Philosophical Implications of Inflationary Cosmology], p. 39. {{webarchive|url=https://web.archive.org/web/20050512134626/http://www.princeton.edu/~jknobe/physics.pdf|date=May 12, 2005}}.</ref> ===Difficulty=== Some scholars argue that certain scenarios such as global [[Nuclear warfare|thermonuclear war]] would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach into a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even to [[McMurdo Station]] in Antarctica, which has contingency plans and supplies for long isolation.<ref name="wells">{{cite book |last=Wells |first=Willard. |title=Apocalypse when? |publisher=Praxis |year=2009 |isbn=978-0387098364}}</ref> In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war.<ref name="matheny"/> The existence of [[nuclear submarine]]s, which can stay hundreds of meters deep in the ocean for potentially years at a time, should also be considered. Any number of events could lead to a massive loss of human life, but if the last few (see [[minimum viable population]]) most resilient humans are unlikely to also die off, then that particular human extinction scenario may not seem credible.<ref>{{cite journal |last1=Tonn |first1=Bruce |first2=Donald |last2=MacGregor |ssrn=1775342 |title=A singular chain of events |journal=Futures |volume=41 |issue=10 |year=2009 |pages=706β714|doi=10.1016/j.futures.2009.07.009 |s2cid=144553194 }}</ref> == Ethics == ===Value of human life=== "Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress.{{sfn|Bostrom|2013}} Multiple scholars have argued based on the size of the "cosmic endowment" that because of the inconceivably large number of potential future lives that are at stake, even small reductions of existential risk have great value. In one of the earliest discussions of ethics of human extinction, [[Derek Parfit]] offers the following thought experiment:<ref name="Parfit 1984">{{Cite book |last=Parfit |first=Derek |title=Reasons and Persons |title-link=Reasons and Persons |date=1984 |publisher=Oxford University Press |pages=453β454}}</ref> {{Cquote | quote = I believe that if we destroy mankind, as we now can, this outcome will be much worse than most people think. Compare three outcomes:<br> (1) Peace.<br> (2) A nuclear war that kills 99% of the world's existing population.<br>(3) A nuclear war that kills 100%.<br> (2) would be worse than (1), and (3) would be worse than (2). Which is the greater of these two differences? Most people believe that the greater difference is between (1) and (2). I believe that the difference between (2) and (3) is very much greater. 
| author = Derek Parfit | source = }} The scale of what is lost in an existential catastrophe is determined by humanity's long-term potential β what humanity could expect to achieve if it survived.<ref name="Ord 20202" /> From a [[Utilitarianism|utilitarian]] perspective, the value of protecting humanity is the product of its duration (how long humanity survives), its size (how many humans there are over time), and its quality (on average, how good is life for future people).<ref name="Ord 20202" />{{rp|273}}<ref>{{Cite web |last1=MacAskill |first1=William |author-link=William MacAskill |last2=Yetter Chappell |first2=Richard |date=2021 |title=Population Ethics {{!}} Practical Implications of Population Ethical Theories |url=https://www.utilitarianism.net/population-ethics |access-date=August 12, 2021 |website=Introduction to Utilitarianism}}</ref> On average, species survive for around a million years before going extinct. Parfit points out that the Earth will remain habitable for around a billion years.<ref name="Parfit 1984" /> And these might be lower bounds on our potential: if humanity is able to [[Space colonization|expand beyond Earth]], it could greatly increase the human population and survive for trillions of years.<ref name="waste">{{Cite journal |last=Bostrom |first=Nick |year=2009 |title=Astronomical Waste: The opportunity cost of delayed technological development |url=http://www.nickbostrom.com/astronomical/waste.html |journal=Utilitas |volume=15 |issue=3 |pages=308β314 |citeseerx=10.1.1.429.2849 |doi=10.1017/s0953820800004076 |s2cid=15860897}}</ref><ref name="Ord 20202" />{{rp|21}} The size of the foregone potential that would be lost, were humanity to become extinct, is very large. Therefore, reducing existential risk by even a small amount would have a very significant moral value.{{sfn|Bostrom|2013}}<ref>{{Cite web |last=Todd |first=Benjamin |date=2017 |title=The case for reducing existential risks |url=https://80000hours.org/articles/existential-risks/ |access-date=January 8, 2020 |website=[[80,000 Hours]]}}</ref> [[Carl Sagan]] wrote in 1983:<blockquote>If we are required to calibrate extinction in numerical terms, I would be sure to include the number of people in [[future generations]] who would not be born.... (By one calculation), the stakes are one million times greater for extinction than for the more modest nuclear wars that kill "only" hundreds of millions of people. There are many other possible measures of the potential loss β including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise.<ref>{{Cite journal |last=Sagan |first=Carl |date=1983 |title=Nuclear war and climatic catastrophe: Some policy implications |url=https://www.researchgate.net/publication/246982634 |journal=Foreign Affairs |volume=62 |issue=2 |pages=257β292 |doi=10.2307/20041818 |jstor=20041818|s2cid=151058846 }}</ref></blockquote>Philosopher [[Robert Merrihew Adams|Robert Adams]] in 1989 rejected Parfit's "impersonal" views but spoke instead of a moral imperative for loyalty and commitment to "the future of humanity as a vast project... The aspiration for a better society β more just, more rewarding, and more peaceful... 
our interest in the lives of our children and grandchildren, and the hopes that they will be able, in turn, to have the lives of their children and grandchildren as projects."<ref>{{Cite journal |last=Adams |first=Robert Merrihew |date=October 1989 |title=Should Ethics be More Impersonal? a Critical Notice of Derek Parfit, Reasons and Persons |journal=The Philosophical Review |volume=98 |issue=4 |pages=439β484 |doi=10.2307/2185115 |jstor=2185115}}</ref> Philosopher [[Nick Bostrom]] argues in 2013 that [[Preference utilitarianism|preference-satisfactionist]], democratic, custodial, and intuitionist arguments all converge on the common-sense view that preventing existential risk is a high moral priority, even if the exact "degree of badness" of human extinction varies between these philosophies.{{sfn|Bostrom|2013|pp=23β24}} Parfit argues that the size of the "cosmic endowment" can be calculated from the following argument: If Earth remains habitable for a billion more years and can sustainably support a population of more than a billion humans, then there is a potential for 10{{sup|16}} (or 10,000,000,000,000,000) human lives of normal duration.<ref name="parfit">Parfit, D. (1984) [[Reasons and Persons]]. Oxford, England: Clarendon Press. pp. 453β454.</ref> Bostrom goes further, stating that if the universe is empty, then the [[cosmological horizon|accessible universe]] can support at least 10{{sup|34}} biological human life-years; and, if some humans were uploaded onto computers, could even support the equivalent of 10{{sup|54}} cybernetic human life-years.{{sfn|Bostrom|2013}} Some economists and philosophers have defended views, including [[exponential discounting]] and [[Population ethics#Person-affecting views|person-affecting views of population ethics]], on which future people do not matter (or matter much less), morally speaking.<ref>{{Cite journal |last=Narveson |first=Jan |date=1973 |title=Moral Problems of Population |journal=The Monist |volume=57 |issue=1 |pages=62β86 |doi=10.5840/monist197357134 |pmid=11661014}}</ref> While these views are controversial,<ref name="matheny"/><ref>{{Cite journal |last=Greaves |first=Hilary |date=2017 |title=Discounting for Public Policy: A Survey |url=https://www.cambridge.org/core/journals/economics-and-philosophy/article/discounting-for-public-policy-a-survey/4CDDF711BF8782F262693F4549B5812E |journal=Economics & Philosophy |volume=33 |issue=3 |pages=391β439 |doi=10.1017/S0266267117000062 |issn=0266-2671 |s2cid=21730172}}</ref><ref>{{Cite journal |last=Greaves |first=Hilary |date=2017 |title=Population axiology |url=https://onlinelibrary.wiley.com/doi/abs/10.1111/phc3.12442 |journal=Philosophy Compass |volume=12 |issue=11 |pages=e12442 |doi=10.1111/phc3.12442 |issn=1747-9991|url-access=subscription }}</ref> they would agree that an existential catastrophe would be among the worst things imaginable. It would cut short the lives of eight billion presently existing people, destroying all of what makes their lives valuable, and most likely subjecting many of them to profound suffering. 
So even setting aside the value of future generations, there may be strong reasons to reduce existential risk, grounded in concern for presently existing people.<ref>{{Cite web |last=Lewis |first=Gregory |date=May 23, 2018 |title=The person-affecting value of existential risk reduction |website=www.gregoryjlewis.com |url=https://gregoryjlewis.com/2018/05/23/the-person-affecting-value-of-existential-risk-reduction/ |access-date=August 7, 2020}}</ref> Beyond utilitarianism, other moral perspectives lend support to the importance of reducing existential risk. An existential catastrophe would destroy more than just humanity β it would destroy all cultural artifacts, languages, and traditions, and many of the things we value.<ref name="Ord 20202" /><ref name="Sagan 1983">{{Cite magazine |last=Sagan |first=Carl |date=Winter 1983 |title=Nuclear War and Climatic Catastrophe: Some Policy Implications |url=https://www.foreignaffairs.com/articles/1983-12-01/nuclear-war-and-climatic-catastrophe-some-policy-implications |magazine=Foreign Affairs |publisher=Council on Foreign Relations |doi=10.2307/20041818 |jstor=20041818 |accessdate=August 4, 2020}}</ref> So moral viewpoints on which we have duties to protect and cherish things of value would see this as a huge loss that should be avoided.<ref name="Ord 20202" /> One can also consider reasons grounded in duties to past generations. For instance, [[Edmund Burke]] writes of a "partnership...between those who are living, those who are dead, and those who are to be born".<ref>{{Cite book |last=Burke |first=Edmund |title=Select Works of Edmund Burke Volume 2 |publisher=Liberty Fund |year=1999 |editor-last=Canavan |editor-first=Francis |page=192 |chapter=Reflections on the Revolution in France |orig-year=1790 |chapter-url=http://oll-resources.s3.amazonaws.com/titles/656/0005-02_SM.pdf}}</ref> If one takes seriously the debt humanity owes to past generations, Ord argues the best way of repaying it might be to "pay it forward", and ensure that humanity's inheritance is passed down to future generations.<ref name="Ord 20202" />{{rp|49β51}} ===Voluntary extinction=== [[File:Voluntary Human Extinction Movement - motto.jpg|thumb|[[Voluntary Human Extinction Movement]]]] Some philosophers adopt the [[Antinatalism|antinatalist]] position that human extinction would not be a bad thing, but a good thing. [[David Benatar]] argues that coming into existence is always serious harm, and therefore it is better that people do not come into existence in the future.<ref>{{Cite book |last=Benatar |first=David |url=https://archive.org/details/betternevertohav0000bena/page/28 |title=Better Never to Have Been: The Harm of Coming into Existence |date=2008 |publisher=[[Oxford University Press]] |isbn=978-0199549269 |page=[https://archive.org/details/betternevertohav0000bena/page/28 28] |quote=Being brought into existence is not a benefit but always a harm. 
|author-link=David Benatar}}</ref> Further, Benatar, animal rights activist [[Steven Best]], and anarchist [[Todd May (philosopher)|Todd May]], posit that human extinction would be a positive thing for the other organisms on the planet, and the planet itself, citing, for example, the omnicidal nature of human civilization.<ref>{{Cite book |last=Benatar |first=David |author-link=David Benatar |url=https://archive.org/details/betternevertohav0000bena/page/224 |title=Better Never to Have Been: The Harm of Coming into Existence |date=2008 |publisher=[[Oxford University Press]] |isbn=978-0199549269 |page=[https://archive.org/details/betternevertohav0000bena/page/224 224] |language=en |quote=Although there are many non-human species β especially carnivores β that also cause a lot of suffering, humans have the unfortunate distinction of being the most destructive and harmful species on earth. The amount of suffering in the world could be radically reduced if there were no more humans.}}</ref><ref>{{Cite book |last=Best |first=Steven |title=The Politics of Total Liberation: Revolution for the 21st Century|chapter=Conclusion: Reflections on Activism and Hope in a Dying World and Suicidal Culture|date=2014 |publisher=[[Palgrave Macmillan]] |isbn=978-1137471116|doi=10.1057/9781137440723_7|page=165 |quote=In an era of catastrophe and crisis, the continuation of the human species in a viable or desirable form, is obviously contingent and ''not a given or necessary good''. But considered from ''the standpoint of animals and the earth'', the demise of humanity would be the best imaginable event possible, and the sooner the better. The extinction of Homo sapiens would remove the malignancy ravaging the planet, destroy a parasite consuming its host, shut down the killing machines, and allow the earth to regenerate while permitting new species to evolve. |author-link=Steven Best}}</ref><ref>{{Cite news |last=May |first=Todd |author-link=Todd May (philosopher) |date=December 17, 2018 |title=Would Human Extinction Be a Tragedy? |url=https://www.nytimes.com/2018/12/17/opinion/human-extinction-climate-change.html |work=[[The New York Times]] |quote=Human beings are destroying large parts of the inhabitable earth and causing unimaginable suffering to many of the animals that inhabit it. This is happening through at least three means. First, human contribution to climate change is devastating ecosystems ... Second, the increasing human population is encroaching on ecosystems that would otherwise be intact. Third, factory farming fosters the creation of millions upon millions of animals for whom it offers nothing but suffering and misery before slaughtering them in often barbaric ways. There is no reason to think that those practices are going to diminish any time soon. Quite the opposite.}}</ref> The environmental view in favor of human extinction is shared by the members of [[Voluntary Human Extinction Movement]] and the [[Church of Euthanasia]] who call for refraining from reproduction and allowing the human species to go peacefully extinct, thus stopping further [[environmental degradation]].<ref>{{Cite book |last=MacCormack |first=Patricia |title=The Ahuman Manifesto: Activism for the End of the Anthropocene |date=2020 |publisher=[[Bloomsbury Publishing|Bloomsbury Academic]] |isbn=978-1350081093 |pages=143, 166 |author-link=Patricia MacCormack}}</ref> == In fiction == <!-- This section should remain brief and well-sourced; more detailed examples should go in "Apocalyptic and post-apocalyptic fiction" article. 
--> {{Main|Apocalyptic and post-apocalyptic fiction}} [[Jean-Baptiste Cousin de Grainville]]'s 1805 [[science fantasy]] novel ''Le dernier homme'' (''The Last Man''), which depicts human extinction due to infertility, is considered the first modern apocalyptic novel and credited with launching the genre.<ref name="Wagar">{{Cite journal |last=Wagar |first=W. Warren |date=2003 |title=Review of The Last Man, Jean-Baptiste François Xavier Cousin de Grainville |url=https://www.jstor.org/stable/20718566 |journal=[[Utopian Studies]] |volume=14 |issue=1 |pages=178–180 |issn=1045-991X |jstor=20718566}}</ref> Other notable early works include [[Mary Shelley]]'s 1826 ''[[The Last Man (Mary Shelley novel)|The Last Man]]'', depicting human extinction caused by a [[pandemic]], and [[Olaf Stapledon]]'s 1937 ''[[Star Maker]]'', "a comparative study of omnicide".{{r|Moynihan}} Some 21st-century pop-science works, including ''[[The World Without Us]]'' by [[Alan Weisman]] and the television specials ''[[Life After People]]'' and ''[[Aftermath: Population Zero]]'', pose a [[thought experiment]]: what would happen to the rest of the planet if humans suddenly disappeared?<ref>{{Cite news |date=August 18, 2007 |title=He imagines a world without people. But why? |work=[[The Boston Globe]] |url=http://archive.boston.com/news/globe/living/articles/2007/08/18/he_imagines_a_world_without_people_but_why/ |access-date=July 20, 2016}}</ref><ref>{{Cite news |last=Tucker |first=Neely |date=March 8, 2008 |title=Depopulation Boom |newspaper=[[The Washington Post]] |url=https://www.washingtonpost.com/wp-dyn/content/article/2008/03/07/AR2008030703256.html?hpid=artslot |access-date=July 20, 2016}}</ref> A threat of human extinction, such as through a [[technological singularity]] (also called an intelligence explosion), drives the plot of numerous science fiction stories; an influential early example is the 1951 film adaptation of ''[[When Worlds Collide]]''.<ref>{{Cite book |last=Barcella |first=Laura |title=The end: 50 apocalyptic visions from pop culture that you should know about – before it's too late |date=2012 |publisher=Zest Books |isbn=978-0982732250 |location=San Francisco, California |language=en}}</ref> Usually the extinction threat is narrowly avoided, but some exceptions exist, such as ''[[R.U.R.]]'' and [[Steven Spielberg]]'s ''[[A.I. 
Artificial Intelligence|A.I.]]''<ref>{{Cite book |last=Dinello |first=Daniel |title=Technophobia!: science fiction visions of posthuman technology |date=2005 |publisher=University of Texas press |isbn=978-0-292-70986-7 |edition=1st |location=Austin, Texas |language=en-us}}</ref> {{page needed|date=May 2024}} ==See also== {{cmn|colwidth=22em| * [[Apocalypticism]] * [[Societal collapse]] * [[Eschatology]] * [[Extinction event]] * [[Global catastrophic risk]] * [[Great Filter]] * [[Holocene extinction]] * [[Speculative evolution]] * [[Voluntary Human Extinction Movement]] * [[World War III]] }} == References == {{reflist|refs= <ref name="Ord 20202">{{Cite book |last=Ord |first=Toby |title=The Precipice: Existential Risk and the Future of Humanity |date=2020 |publisher=Hachette |isbn=9780316484916 |location=New York}}</ref> <ref name="Moynihan">{{Cite news |last=Moynihan |first=Thomas |date=September 23, 2020 |title=How Humanity Came To Contemplate Its Possible Extinction: A Timeline |work=[[MIT Press|The MIT Press Reader]] |url=https://thereader.mitpress.mit.edu/how-humanity-discovered-its-possible-extinction-timeline/ |access-date=October 11, 2020}}<br />See also: * {{Cite journal |last=Moynihan |first=Thomas |date=February 2020 |title=Existential risk and human extinction: An intellectual history |journal=[[Futures (journal)|Futures]] |volume=116 |pages=102495 |doi=10.1016/j.futures.2019.102495 |s2cid=213388167 |issn=0016-3287}} * {{Cite book |last=Moynihan |first=Thomas |title=X-Risk: How Humanity Discovered Its Own Extinction |date=2020 |publisher=[[MIT Press]] |isbn=978-1-913029-82-1 |url=https://books.google.com/books?id=7M_tDwAAQBAJ}}</ref> }} == Sources == {{refbegin}} * {{Cite journal |last=Bostrom |first=Nick |author-link=Nick Bostrom |date=2002 |title=Existential risks: analyzing human extinction scenarios and related hazards |url=https://ora.ox.ac.uk/objects/uuid:827452c3-fcba-41b8-86b0-407293e6617c |journal=[[Journal of Evolution and Technology]] |volume=9 |issn=1541-0099}} * {{Cite book |last1=Bostrom |first1=Nick |url=https://books.google.com/books?id=sTkfAQAAQBAJ |title=Global Catastrophic Risks |last2=Cirkovic |first2=Milan M. |date=September 29, 2011 |publisher=[[Oxford University Press]] |isbn=9780199606504 |editor-last=Bostrom |editor-first=Nick |editor-link=Nick Bostrom |pages=1β30 |chapter=1: Introduction |oclc=740989645 |author-link=Nick Bostrom |author-link2=Milan M. ΔirkoviΔ |editor-last2=Cirkovic |editor-first2=Milan M. |editor-link2=Milan M. ΔirkoviΔ |chapter-url=https://books.google.com/books?id=sTkfAQAAQBAJ&pg=PA1 |orig-date=Orig. July 3, 2008}} ** {{harvc |last=Rampino |first=Michael R. |c=10: Super-volcanism and other geophysical processes of catastrophic import |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA205 |pages=205β221 |author-link=Michael R. 
** {{harvc |last=Napier |first=William |c=11: Hazards from comets and asteroids |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA222 |pages=222–237 |author-link=William Napier (astronomer) |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last=Dar |first=Arnon |c=12: Influence of Supernovae, gamma-ray bursts, solar flares, and cosmic rays on the terrestrial environment |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA238 |pages=238–262 |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last1=Frame |first1=David |c=13: Climate change and global risk |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA265 |pages=265–286 |last2=Allen |first2=Myles R. |author-link2=Myles Allen |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last=Kilbourne |first=Edwin Dennis |c=14: Plagues and pandemics: past, present, and future |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA287 |pages=287–304 |author-link=Edwin D. Kilbourne |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last=Yudkowsky |first=Eliezer |c=15: Artificial Intelligence as a positive and negative factor in global risk |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA308 |pages=308–345 |author-link=Eliezer Yudkowsky |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last=Wilczek |first=Frank |c=16: Big troubles, imagined and real |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA346 |pages=346–362 |author-link=Frank Wilczek |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last=Cirincione |first=Joseph |c=18: The continuing threat of nuclear war |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA381 |pages=381–401 |author-link=Joseph Cirincione |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last1=Ackerman |first1=Gary |c=19: Catastrophic nuclear terrorism: a preventable peril |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA402 |pages=402–449 |last2=Potter |first2=William C. |author-link=Gary Ackerman |author-link2=William C. Potter |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last1=Nouri |first1=Ali |c=20: Biotechnology and biosecurity |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA450 |pages=450–480 |last2=Chyba |first2=Christopher F. |author-link2=Christopher Chyba |in1=Bostrom|in2=Cirkovic|year=2011}}
** {{harvc |last1=Phoenix |first1=Chris |c=21: Nanotechnology as global catastrophic risk |url=https://www.google.com/books/edition/Global_Catastrophic_Risks/sTkfAQAAQBAJ?gbpv=1&pg=PA481 |pages=481–503 |last2=Treder |first2=Mike |author-link=Chris Phoenix (nanotechnologist) |in1=Bostrom|in2=Cirkovic|year=2011}}
* {{Cite journal |last=Bostrom |first=Nick |author-link=Nick Bostrom |date=2013 |title=Existential Risk Prevention as Global Priority |url=https://www.existential-risk.org/concept.html |journal=[[Global Policy]] |volume=4 |issue=1 |pages=15–31 |doi=10.1111/1758-5899.12002 |issn=1758-5899 |url-access=subscription}} [ [http://www.existential-risk.org/concept.pdf PDF] ]
* {{Cite book |last=Leslie |first=John |year=1996 |title=The End of the World: The Science and Ethics of Human Extinction |publisher=[[Routledge]] |isbn=978-0415140430 |oclc=1158823437 |author-link=John A. Leslie |url=https://books.google.com/books?id=gUXgpH6nizIC}}
* {{Cite book |last=Posner |first=Richard A. |url=https://books.google.com/books?id=SDe59lXSrY8C |title=Catastrophe: Risk and Response |date=November 11, 2004 |publisher=[[Oxford University Press]] |isbn=978-0-19-534639-8 |oclc=224729961 |author-link=Richard Posner}}
* {{Cite book |last=Rees |first=Martin J. |url=https://books.google.com/books?id=GqvgCDPFZ18C |title=Our Final Hour: A Scientist's Warning : how Terror, Error, and Environmental Disaster Threaten Humankind's Future in this Century--on Earth and Beyond |date=March 19, 2003 |publisher=[[Basic Books]] |isbn=978-0-465-06862-3 |oclc=51315429 |author-link=Martin Rees}}
{{refend}}

== Further reading ==
{{refbegin}}
* {{Cite book |last=Boulter |first=Michael |title=Extinction: Evolution and the End of Man |date=2005 |publisher=[[Columbia University Press]] |isbn=978-0231128377 |author-link=Michael Boulter}}
* [[Christopher de Bellaigue|de Bellaigue, Christopher]], "A World Off the Hinges" (review of [[Peter Frankopan]], ''The Earth Transformed: An Untold History'', Knopf, 2023, 695 pp.), ''[[The New York Review of Books]]'', vol. LXX, no. 18 (23 November 2023), pp. 40–42. De Bellaigue writes: "Like the [[Maya civilization|Maya]] and the [[Akkadian Empire|Akkad]]ians we have learned that a broken [[natural environment|environment]] aggravates [[politics|political]] and [[economics|economic]] dysfunction and that the inverse is also true. Like the [[Qing]] we rue the deterioration of our [[soil]]s. But the lesson is never learned. [...] [[Denialism]] [...] is one of the most fundamental of human traits and helps explain our current inability to come up with a response commensurate with the perils we face." (p. 41.)
* [[Marshall Brain|Brain, Marshall]] (2020) ''The Doomsday Book: The Science Behind Humanity's Greatest Threats''. Union Square. {{ISBN|9781454939962}}
* [[Jim Holt (philosopher)|Holt, Jim]], "The Power of Catastrophic Thinking" (review of [[Toby Ord]], ''The Precipice: Existential Risk and the Future of Humanity'', Hachette, 2020, 468 pp.), ''[[The New York Review of Books]]'', vol. LXVIII, no. 3 (February 25, 2021), pp. 26–29. [[Jim Holt (philosopher)|Jim Holt]] writes (p. 28): "Whether you are searching for a cure for cancer, or pursuing a scholarly or artistic career, or engaged in establishing more just institutions, a threat to the future of humanity is also a threat to the significance of what you do."
* {{cite journal |last1=MacCormack |first1=Patricia |title=Embracing Death, Opening the World |journal=[[Australian Feminist Studies]] |date=2020 |volume=35 |issue=104 |pages=101–115 |doi=10.1080/08164649.2020.1791689 |s2cid=221790005 |url=https://arro.anglia.ac.uk/id/eprint/705303/3/MacCormack_2020.docx |author-link=Patricia MacCormack |access-date=February 20, 2023 |archive-date=April 5, 2023 |archive-url=https://web.archive.org/web/20230405170411/https://arro.anglia.ac.uk/id/eprint/705303/3/MacCormack_2020.docx |url-status=dead |url-access=subscription}}
* {{Cite news |last=Moyer |first=Michael |date=September 2010 |title=Eternal Fascinations with the End: Why We're Suckers for Stories of Our Own Demise: Our pattern-seeking brains and desire to be special help explain our fears of the apocalypse |work=Scientific American |url=https://www.scientificamerican.com/article.cfm?id=eternal-fascinations}}
* [[Philip Plait|Plait, Philip]] (2008) ''Death from the Skies!: These Are the Ways the World Will End''. Viking. {{ISBN|9780670019977}}
* {{cite journal |last1=Schubert |first1=Stefan |last2=Caviola |first2=Lucius |last3=Faber |first3=Nadira S. |date=2019 |title=The Psychology of Existential Risk: Moral Judgments about Human Extinction |journal=[[Scientific Reports]] |volume=9 |issue=1 |page=15100 |doi=10.1038/s41598-019-50145-9 |pmid=31636277 |pmc=6803761 |bibcode=2019NatSR...915100S}}
* [[Toby Ord|Ord, Toby]] (2020). ''[[The Precipice: Existential Risk and the Future of Humanity]].'' Bloomsbury Publishing. {{ISBN|1526600218}}
* Torres, Phil. (2017). ''Morality, Foresight, and Human Flourishing: An Introduction to Existential Risks''. Pitchstone Publishing. {{ISBN|978-1634311427}}.
* [[Michel Weber]], "Book Review: ''Walking Away from Empire''", ''Cosmos and History: The Journal of Natural and Social Philosophy'', vol. 10, no. 2, 2014, pp. 329–336.
* [https://www.history.com/shows/doomsday-10-ways-the-world-will-end ''Doomsday: 10 Ways the World Will End''] (2016) [[History Channel]]
* [https://www.livescience.com/earth-without-people.html What would happen to Earth if humans went extinct?] [[Live Science]], August 16, 2020.
* [https://www.cnbc.com/2023/05/31/ai-poses-human-extinction-risk-sam-altman-and-other-tech-leaders-warn.html A.I. poses human extinction risk on par with nuclear war, Sam Altman and other tech leaders warn]. [[CNBC]]. May 31, 2023.
* "Treading Thin Air: Geoff Mann on Uncertainty and Climate Change", ''[[London Review of Books]]'', vol. 45, no. 17 (7 September 2023), pp. 17–19. "[W]e are in desperate need of a [[politics]] that looks [the] catastrophic [[uncertainty]] [of [[global warming]] and [[climate change]]] square in the face. That would mean taking much bigger and more transformative steps: all but eliminating [[fossil fuels]]... and prioritizing [[democracy|democratic]] institutions over markets. The burden of this effort must fall almost entirely on the richest people and richest parts of the world, because it is they who continue to gamble with everyone else's fate." (p. 19.)
{{refend}}

{{Doomsday}}
{{Extinction}}

[[Category:Human extinction| ]]