{{Short description|Deliberately misleading information}} {{Distinguish|Misinformation}} {{Other uses|Disinformation (disambiguation)}} {{Use dmy dates|date=June 2017}} {{War}} '''Disinformation''' is misleading<!-- Disinformation does not always use false information; it often contains truths, half-truths, and selective truths to persuade. --> content deliberately spread to [[deceive]] people,<ref name=bittman1985 /><ref name=shultzgodson /> or to secure economic or political gain and which may cause public harm.<ref>{{Cite web |last=European Commission |date=2022-06-16 |title=The Strengthened Code of Practice on Disinformation 2022 |url=https://digital-strategy.ec.europa.eu/en/library/2022-strengthened-code-practice-disinformation |access-date=2024-11-25 |website=digital-strategy.ec.europa.eu |page=1 |language=en}}</ref> Disinformation is an orchestrated adversarial activity in which actors employ [[strategic deception]]s and [[media manipulation]] tactics to advance political, military, or commercial goals.<ref name=":5">{{Cite journal |last=Diaz Ruiz |first=Carlos |date=2025 |title=Disinformation on digital media platforms: A market-shaping approach |journal=New Media & Society |volume=27 |issue=4 |pages=2188-2211 |doi=10.1177/14614448231207644 |s2cid=264816011 |doi-access=free}}{{Creative Commons text attribution notice|cc=by4|from this source=yes}}</ref> Disinformation is implemented through coordinated campaigns<ref>{{Cite book |last=Diaz Ruiz |first=Carlos |url=https://www.taylorfrancis.com/books/9781003506676 |title=Market-Oriented Disinformation Research: Digital Advertising, Disinformation and Fake News on Social Media |date=2025-03-14 |publisher=Routledge |isbn=978-1-003-50667-6 |edition=1 |location=London |pages=29 |language=en |doi=10.4324/9781003506676-2}}</ref> that "weaponize multiple [[Rhetorical Strategies|rhetorical strategies]] and forms of knowing—including not only falsehoods but also [[truth]]s, [[half-truth]]s, and [[Value judgment|value judgements]]—to exploit and amplify [[culture war]]s and other identity-driven controversies."<ref name="DiazRuizNilsson2022" /> In contrast, ''[[misinformation]]'' refers to inaccuracies that stem from inadvertent error.<ref>{{Cite book|url=https://unesdoc.unesco.org/ark:/48223/pf0000265552 |last=Ireton|first= C|last2= Posetti|first2= J|year=2018|title= Journalism, fake news & disinformation: handbook for journalism education and training |publisher= UNESCO|access-date=7 August 2021 |archive-date=6 April 2023 |archive-url=https://web.archive.org/web/20230406163611/https://unesdoc.unesco.org/ark:/48223/pf0000265552 |url-status=live|isbn = 978-92-3-100281-6}}</ref> Misinformation can be used to create disinformation when known misinformation is purposefully and intentionally disseminated.<ref name="golbeck">{{citation |title=Computing with Social Trust |pages=19–20 |year=2008 |editor1-last=Golbeck |editor1-first=Jennifer |series=Human-Computer Interaction Series |publisher=Springer |isbn=978-1-84800-355-2}}</ref> "[[Fake news]]" has sometimes been categorized as a type of disinformation, but scholars have advised not using these two terms interchangeably or using "fake news" altogether in academic writing since politicians have weaponized it to describe any unfavorable [[news coverage]] or information.<ref>{{Cite journal |last1=Freelon |first1=Deen |last2=Wells |first2=Chris |date=2020-03-03 |title=Disinformation as Political Communication |url=https://www.tandfonline.com/doi/full/10.1080/10584609.2020.1723755 |journal=Political 
Communication |language=en |volume=37 |issue=2 |pages=145–156 |doi=10.1080/10584609.2020.1723755 |issn=1058-4609 |s2cid=212897113 |access-date=17 July 2023 |archive-date=17 July 2023 |archive-url=https://web.archive.org/web/20230717173304/https://www.tandfonline.com/doi/full/10.1080/10584609.2020.1723755 |url-status=live |url-access=subscription }}</ref> ==Etymology== [[File:The Etymology of Disinformation.png|thumb|The Etymology of Disinformation, by H. Newman, as published in the ''Journal of Information Warfare'' in 2021.<ref>{{Cite journal |last=Newman |first=Hadley |date=2022 |title=Information Warfare: Leveraging the DMMI Matrix Cube for Risk Assessment |url=https://www.jstor.org/stable/27199985 |journal=Journal of Information Warfare |volume=21 |issue=3 |pages=84–102 |jstor=27199985 |issn=1445-3312}}</ref><ref name=":3">{{Cite web |last=Newman |first=Hadley |date=2022 |title=Author |url=https://www.jinfowar.com/authors/hadley-newman |url-status=live |archive-url=https://web.archive.org/web/20221228134920/https://www.jinfowar.com/authors/hadley-newman |archive-date=28 December 2022 |access-date=28 December 2022 |website=Journal of Information Warfare |quote=Strategic communications advisor working across a broad range of policy areas for public and multilateral organisations. Counter-disinformation specialist and published author on foreign information manipulation and interference (FIMI).}}</ref> The elements of the word ''disinformation'' have their origins in the [[Proto-Indo-European language|Proto-Indo-European]] language family. The Latin 'dis' and 'in' can both be considered to have Proto-Indo-European roots, while 'forma' is considerably more obscure. The green box in the figure highlights that the origin of 'forma' is uncertain; it may, however, have its roots in the [[Aristotelianism|Aristotelean]] concept of μορφή (morphe), in which something becomes a 'thing' when it has 'form' or substance.]] The English word [[wikt:disinformation|''disinformation'']] comes from the application of the Latin prefix [[wikt:dis-|''dis-'']] to [[wikt:information|''information'']], giving the meaning "reversal or removal of information".
The rarely used word had appeared with this usage in print at least as far back as 1887.<ref>{{Cite news |date=1887-02-17 |title=City & County Cullings (Early use of the word "disinformation" 1887) |pages=3 |work=Medicine Lodge Cresset |url=https://www.newspapers.com/clip/7726932/early-use-of-the-word-disinformation/ |access-date=2021-05-24 |archive-date=24 May 2021 |archive-url=https://web.archive.org/web/20210524135425/https://www.newspapers.com/clip/7726932/early-use-of-the-word-disinformation/ |url-status=live }}</ref><ref>{{Cite news |date=1892-08-18 |title=Professor Young on Mars and disinformation (1892) |pages=4 |work=The Salt Lake Herald |url=https://www.newspapers.com/clip/7729235/professor-young-on-mars-and/ |access-date=2021-05-24 |archive-date=24 May 2021 |archive-url=https://web.archive.org/web/20210524135429/https://www.newspapers.com/clip/7729235/professor-young-on-mars-and/ |url-status=live }}</ref><ref>{{Cite news |date=1907-09-26 |title=Pure nonsense (early use of the word disinformation) (1907) |pages=8 |work=The San Bernardino County Sun |url=https://www.newspapers.com/clip/7729323/pure-nonsense-early-use-of-the-word/ |access-date=2021-05-24 |archive-date=24 May 2021 |archive-url=https://web.archive.org/web/20210524135452/https://www.newspapers.com/clip/7729323/pure-nonsense-early-use-of-the-word/ |url-status=live }}</ref><ref>{{Cite news |date=1917-12-18 |title=Support for Red Cross helps U.S. boys abroad, Rotary Club is told (1917) |pages=4 |work=The Sheboygan Press |url=https://www.newspapers.com/clip/7818737/support-for-red-cross-helps-us-boys/ |access-date=2021-05-24 |archive-date=24 May 2021 |archive-url=https://web.archive.org/web/20210524135431/https://www.newspapers.com/clip/7818737/support-for-red-cross-helps-us-boys/ |url-status=live }}</ref> Some consider it a [[loan translation]] of the Russian {{lang|ru|дезинформация}}, [[Romanization of Russian|transliterated]] as [[wikt:дезинформация|''dezinformatsiya'']],<ref name="pacepa">{{citation |author=[[Ion Mihai Pacepa]] and [[Ronald J. 
Rychlak]] |title=Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism |title-link=Disinformation (book) |pages=4–6, 34–39, 75 |year=2013 |publisher=WND Books |isbn=978-1-936488-60-5}}</ref><ref name="bittman1985" /><ref name="shultzgodson" /> apparently derived from the title of a KGB [[black propaganda]] department.<ref name=jowett>{{citation|author1=Garth Jowett |author2=[[Victoria O'Donnell]]|title=Propaganda and Persuasion|pages=21–23|publisher=Sage Publications|isbn=978-1-4129-0898-6|year=2005|chapter=What Is Propaganda, and How Does It Differ From Persuasion?|quote=In fact, the word disinformation is a cognate for the Russian dezinformatsia, taken from the name of a division of the [[KGB]] devoted to black propaganda.}}</ref><ref name="bittman1985">{{citation|first=Ladislav|last=Bittman|author-link=Lawrence Martin-Bittman|title=The KGB and Soviet Disinformation: An Insider's View|year=1985|isbn=978-0-08-031572-0|publisher=Pergamon-Brassey's|pages=49–50|title-link=The KGB and Soviet Disinformation}}</ref><ref name="adamtaylor">{{citation|url=https://www.washingtonpost.com/news/worldviews/wp/2016/11/26/before-fake-news-there-was-soviet-disinformation|newspaper=[[The Washington Post]]|date=26 November 2016|access-date=3 December 2016|title=Before 'fake news,' there was Soviet 'disinformation'|first=Adam|last=Taylor|archive-date=14 May 2019|archive-url=https://web.archive.org/web/20190514041408/https://www.washingtonpost.com/news/worldviews/wp/2016/11/26/before-fake-news-there-was-soviet-disinformation/|url-status=live}}</ref><ref name="pacepa" /> Soviet planners in the 1950s defined disinformation as "dissemination (in the press, on the radio, etc.) of false reports intended to mislead [[public opinion]]."<ref name="ned">{{citation|first=Dean|last=Jackson|title=Distinguishing Disinformation from Propaganda, Misinformation, and 'Fake News' |year=2018|publisher=[[National Endowment for Democracy]]|url=https://www.ned.org/wp-content/uploads/2018/06/Distinguishing-Disinformation-from-Propaganda.pdf|access-date=31 May 2022|archive-date=7 April 2022|archive-url=https://web.archive.org/web/20220407003326/https://www.ned.org/wp-content/uploads/2018/06/Distinguishing-Disinformation-from-Propaganda.pdf|url-status=live}}</ref> ''Disinformation'' first made an appearance in dictionaries in 1985, specifically, ''Webster's New College Dictionary'' and the ''American Heritage Dictionary''.<ref name= bittman1988>{{citation|title=The New Image-Makers: Soviet Propaganda & Disinformation Today|first=Ladislav|last=Bittman|author-link=Lawrence Martin-Bittman|year=1988|pages=7, 24|isbn=978-0-08-034939-8|publisher=Brassey's Inc}}</ref> In 1986, the term ''disinformation'' was not defined in ''Webster's New World Thesaurus'' or ''New Encyclopædia Britannica''.<ref name=pacepa /> After the Soviet term became widely known in the 1980s, native speakers of English broadened the term as "any government communication (either overt or covert) containing intentionally false and misleading material, often combined selectively with true information, which seeks to mislead and [[Psychological manipulation|manipulate]] either elites or a [[mass audience]]."<ref name=shultzgodson>{{citation|first1=Richard H.|last1=Shultz|first2=Roy|last2=Godson|author1-link=Richard H. 
Shultz|author2-link=Roy Godson|title=Dezinformatsia: Active Measures in Soviet Strategy|publisher=Pergamon-Brassey's|year=1984|isbn=978-0-08-031573-7|pages=[https://archive.org/details/dezinformatsiaac0000shul/page/37 37–38]|title-link=Dezinformatsia (book)}}</ref> By 1990, use of the term ''disinformation'' had fully established itself in the English language within the lexicon of politics.<ref name=davidmartin>{{citation|title=The Web of Disinformation: Churchill's Yugoslav Blunder|first=David|last=Martin|page=[https://archive.org/details/webofdisinformat0000mart/page/ xx]|publisher=Harcourt Brace Jovanovich|year=1990|isbn=978-0-15-180704-8|url=https://archive.org/details/webofdisinformat0000mart/page/}}</ref> By 2001, the term ''disinformation'' had come to be known as simply a more civil phrase for saying someone was [[lie|lying]].<ref name=heinemann>{{citation|page=124|title=Developing Media Skills|first=Geoff |last=Barton |publisher=Heinemann|isbn=978-0-435-10960-8|year=2001}}</ref> Stanley B. Cunningham wrote in his 2002 book ''The Idea of Propaganda'' that ''disinformation'' had become pervasively used as a synonym for [[propaganda]].<ref name=cunningham>{{citation|title=The Idea of Propaganda: A Reconstruction|first=Stanley B. |last=Cunningham|pages=67–68, 110|chapter=Disinformation (Russian: ''dezinformatsiya'')|year=2002|isbn=978-0-275-97445-9|publisher=Praeger}}</ref> ==Operationalization== The [[Shorenstein Center on Media, Politics and Public Policy|Shorenstein Center]] at Harvard University defines disinformation research as an academic field that studies "the spread and impacts of misinformation, disinformation, and media manipulation", including "how it spreads through online and offline channels, and why people are susceptible to believing bad information, and successful strategies for mitigating its impact".<ref>{{Cite web |title=Disinformation |url=https://shorensteincenter.org/research-initiatives/disinformation/ |access-date=2023-10-30 |website=Shorenstein Center |language=en-US |archive-date=30 October 2023 |archive-url=https://web.archive.org/web/20231030110857/https://shorensteincenter.org/research-initiatives/disinformation/ |url-status=live }}</ref> According to a 2023 research article published in [[New Media & Society]],<ref name=":5" /> disinformation circulates on [[social media]] through deception campaigns implemented in multiple ways including: [[astroturfing]], [[Conspiracy theory|conspiracy theories]], [[clickbait]], [[culture war]]s, [[Echo chamber (media)|echo chambers]], hoaxes, [[fake news]], [[propaganda]], [[pseudoscience]], and [[rumor]]s. {| class="wikitable" ! 
colspan="4" |Activities that operationalize disinformation campaigns online<ref name=":5" /> |- !Term !Description !Term !Description |- |[[Astroturfing]] |A centrally coordinated campaign that mimics grassroots activism by making participants pretend to be ordinary citizens |[[Fake news]] |Genre: The deliberate creation of pseudo-journalism Label: The instrumentalization of the term to delegitimize news media |- |[[Conspiracy theory|Conspiracy theories]] |Rebuttals of official accounts that propose alternative explanations in which individuals or groups act in secret |[[Greenwashing]] |Deceptive communication makes people believe that a company is environmentally responsible when it is not |- |[[Clickbait]] |The deliberate use of misleading headlines and thumbnails to increase online traffic for profit or popularity |[[Propaganda]] |Organized mass communication, on a hidden agenda, and with a mission to conform belief and action by circumventing individual reasoning |- |[[Culture war]]s |A phenomenon in which multiple groups of people, who hold entrenched values, attempt to steer public policy contentiously |[[Pseudoscience]] |Accounts that claim the explanatory power of science, borrow its language and legitimacy but diverge substantially from its quality criteria |- |[[Doxing|Doxxing]] |A form of online harassment that breaches privacy boundaries by releasing information intending physical and online harm to a target |[[Rumor]]s |Unsubstantiated news stories that circulate while not corroborated or validated |- |[[Echo chamber (media)|Echo chamber]] |An epistemic environment in which participants encounter beliefs and opinions that coincide with their own |[[Troll (slang)|Trolling]] |Networked groups of digital influencers that operate 'click armies' designed to mobilize public sentiment |- |[[Hoax]] |News in which false facts are presented as legitimate |[[Urban legend]]s |Moral tales featuring durable stories of intruders incurring boundary transgressions and their dire consequences |- | colspan="4" |Note: This is an adaptation of Table 2 from [https://doi.org/10.1177/14614448231207644 Disinformation on Digital Media Platforms: A Market Shaping Approach], by Carlos Diaz Ruiz, used under [http://creativecommons.org/licenses/by/4.0/ CC BY 4.0] / Adapted from the original. |} In order to distinguish between similar terms, including misinformation and malinformation, scholars collectively agree on the definitions for each term as follows: (1) disinformation is the strategic dissemination of false information with the intention to cause public harm;<ref>Center for Internet Security. (3 October 2022). "Essential Guide to Election Security:Managing Mis-, Dis-, and Malinformation". [https://essentialguide.docs.cisecurity.org/en/latest/bp/mdm_info.html CIS website] {{Webarchive|url=https://web.archive.org/web/20231218162438/https://essentialguide.docs.cisecurity.org/en/latest/bp/mdm_info.html |date=18 December 2023 }} Retrieved 18 December 2023.</ref> (2) [[misinformation]] represents the unintentional spread of false information; and (3) [[malinformation]] is factual information disseminated with the intention to cause harm,<ref>{{Cite journal |last1=Baines |first1=Darrin |last2=Elliott |first2=Robert J. R. 
|date=April 2020 |title=Defining misinformation, disinformation and malinformation: An urgent need for clarity during the COVID-19 infodemic |url=https://ideas.repec.org//p/bir/birmec/20-06.html |journal=Discussion Papers |language=en |access-date=14 December 2022 |archive-date=14 December 2022 |archive-url=https://web.archive.org/web/20221214124131/https://ideas.repec.org//p/bir/birmec/20-06.html |url-status=live }}</ref><ref>{{Cite web |title=Information disorder: Toward an interdisciplinary framework for research and policy making |url=https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html |access-date=2022-12-14 |website=Council of Europe Publishing |language=en |archive-date=14 December 2022 |archive-url=https://web.archive.org/web/20221214125635/https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html |url-status=live }}</ref> these terms are abbreviated 'DMMI'.<ref>{{Cite journal |last=Newman |first=Hadley |orig-date=16 September 2021 |title=Understanding the Differences Between Disinformation, Misinformation, Malinformation and Information – Presenting the DMMI Matrix |url=https://committees.parliament.uk/writtenevidence/39289/html/ |journal=Draft Online Safety Bill (Joint Committee) |location=UK |publisher=UK Government |access-date=4 January 2023 |archive-date=4 January 2023 |archive-url=https://web.archive.org/web/20230104112358/https://committees.parliament.uk/writtenevidence/39289/html/ |url-status=live }}</ref> In 2019, [[Camille François]] devised the "ABC" framework of understanding different modalities of online disinformation: * Manipulative ''Actors'', who "engage knowingly and with clear intent in viral deception campaigns" that are "covert, designed to obfuscate the identity and intent of the actor orchestrating them." Examples include personas such as [[Guccifer 2.0]], [[Troll (slang)|Internet trolls]], [[state media]], and military operatives. * Deceptive ''Behavior'', which "encompasses the variety of techniques viral deception actors may use to enhance and exaggerate the reach, virality and impact of their campaigns." Examples include [[troll farm]]s, [[Internet bots]], [[astroturfing]], and "[[Facebook like button#Fake "likes"|paid engagement]]". 
* Harmful ''Content'', which includes [[Misinformation|health misinformation]], [[Media manipulation|manipulated media]] such as [[deepfakes]], [[Cyberbullying|online harassment]], [[violent extremism]], [[hate speech]] or [[terrorism]].<ref>{{Cite web |last=François |first=Camille |date=2019-09-20 |title=Actors, Behaviors, Content: A Disinformation ABC – Highlighting Three Vectors of Viral Deception to Guide Industry & Regulatory Responses |url=https://docs.house.gov/meetings/SY/SY21/20190926/109980/HHRG-116-SY21-Wstate-FrancoisC-20190926-SD001.pdf |archive-url=https://web.archive.org/web/20230321071912/https://docs.house.gov/meetings/SY/SY21/20190926/109980/HHRG-116-SY21-Wstate-FrancoisC-20190926-SD001.pdf |archive-date=2023-03-21 |access-date=2024-05-17}}</ref> In 2020, the [[Brookings Institution]] proposed amending this framework to include ''Distribution'', defined by the "technical protocols that enable, constrain, and shape user behavior in a virtual space".<ref>{{Cite web |last=Alaphilippe |first=Alexandre |date=2020-04-27 |title=Adding a 'D' to the ABC disinformation framework |url=https://www.brookings.edu/articles/adding-a-d-to-the-abc-disinformation-framework/ |archive-url=https://web.archive.org/web/20231027042531/https://www.brookings.edu/articles/adding-a-d-to-the-abc-disinformation-framework/ |archive-date=2023-10-27 |access-date=2024-05-18 |website=[[Brookings Institution]] |language=en-US}}</ref> Similarly, the [[Carnegie Endowment for International Peace]] proposed adding ''Degree'' ("distribution of the content ... and the audiences it reaches") and ''Effect'' ("how much of a threat a given case poses").<ref>{{Cite report |url=https://www.jstor.org/stable/resrep26180.6 |title=The ABCDE Framework |last=Pamment |first=James |date=2020 |publisher=[[Carnegie Endowment for International Peace]] |pages=5–9 |archive-url=https://web.archive.org/web/20240318053702/https://carnegieendowment.org/files/Pamment_-_Crafting_Disinformation_1.pdf |archive-date=2024-03-18}}</ref> ===Comparisons with propaganda=== Whether and to what degree disinformation and propaganda overlap is subject to debate. Some (like [[U.S. 
Department of State]]) define propaganda as the use of non-rational arguments to either advance or undermine a political ideal, and use disinformation as an alternative name for undermining propaganda,<ref>{{citation|url=https://www.state.gov/documents/organization/271028.pdf|date= May 2017|archive-date=30 March 2019|title=Can public diplomacy survive the internet?|archive-url= https://web.archive.org/web/20190330160444/https://www.state.gov/documents/organization/271028.pdf}}</ref>{{Page needed|date=March 2025}} while others consider them to be separate concepts altogether.<ref>{{citation|url=https://www.interpretermag.com/wp-content/uploads/2014/11/The_Menace_of_Unreality_Final.pdf|date=2014|title= The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money|publisher=Institute of Modern Russia|archive-url=https://web.archive.org/web/20190203160823/https://www.interpretermag.com/wp-content/uploads/2014/11/The_Menace_of_Unreality_Final.pdf |archive-date=3 February 2019 }}</ref> One popular distinction holds that disinformation also describes politically motivated messaging designed explicitly to engender public cynicism, uncertainty, apathy, distrust, and paranoia, all of which disincentivize citizen engagement and mobilization for social or political change.<ref name=ned/> ==Practice== Disinformation is the label often given to foreign information manipulation and interference (FIMI).<ref name=":4">{{Cite book |last=Newman |first=Hadley |date=2022 |title=Foreign information manipulation and interference defence standards: Test for rapid adoption of the common language and framework 'DISARM' |url=https://stratcomcoe.org/publications/foreign-information-manipulation-and-interference-defence-standards-test-for-rapid-adoption-of-the-common-language-and-framework-disarm-prepared-in-cooperation-with-hybrid-coe/253 |series=NATO Strategic Communications Centre of Excellence |pages=60 |format=PDF |publication-place=Latvia |via=European Centre of Excellence for Countering Hybrid Threats |isbn=978-952-7472-46-0 |access-date=28 December 2022 |archive-date=28 December 2022 |archive-url=https://web.archive.org/web/20221228165610/https://stratcomcoe.org/publications/foreign-information-manipulation-and-interference-defence-standards-test-for-rapid-adoption-of-the-common-language-and-framework-disarm-prepared-in-cooperation-with-hybrid-coe/253 |url-status=live }}</ref><ref>{{Cite web |last=European Extrernal Action Service (EEAS) |date=27 October 2021 |title=Tackling Disinformation, Foreign Information Manipulation & Interference |url=https://www.eeas.europa.eu/eeas/tacklingdisinformation-foreign-information-manipulation-interference_en.}}</ref> Studies on disinformation are often concerned with the content of activity whereas the broader concept of FIMI is more concerned with the "behaviour of an actor" that is described through the [[military doctrine]] concept of [[Terrorist Tactics, Techniques, and Procedures|tactics, techniques, and procedures]] (TTPs).<ref name=":4" /> Disinformation is primarily carried out by government [[intelligence agencies]], but has also been used by non-governmental organizations and businesses.<ref name="jangoldman">{{citation|title=Words of Intelligence: A Dictionary|first=Jan|last=Goldman|chapter=Disinformation|isbn=978-0-8108-5641-7|year=2006|publisher=Scarecrow Press|page=43}}</ref> [[Front group]]s are a form of disinformation, as they mislead the public about their true objectives and who their controllers are.<ref 
name="eugeneasamier">{{citation|title=Secrecy and Tradecraft in Educational Administration: The Covert Side of Educational Life|page=176|first=Eugene A.|last=Samier|series=Routledge Research in Education|year=2014|isbn= 978-0-415-81681-6|publisher=Routledge}}</ref> Most recently, disinformation has been deliberately spread through social media in the form of "[[fake news]]", disinformation masked as legitimate news articles and meant to mislead readers or viewers.<ref>{{Cite journal|last1=Tandoc|first1=Edson C|last2=Lim|first2=Darren|last3=Ling|first3=Rich|date=2019-08-07|title=Diffusion of disinformation: How social media users respond to fake news and why|journal=Journalism|volume=21|issue=3|language=en|pages=381–398|doi=10.1177/1464884919868325|s2cid=202281476|issn=1464-8849}}</ref> Disinformation may include distribution of [[Forgery|forged]] [[document]]s, manuscripts, and photographs, or spreading dangerous [[rumour]]s and [[Fabrication (science)|fabricated]] [[intelligence]]. Use of these tactics can lead to [[Blowback (intelligence)|blowback]], however, causing such unintended consequences such as [[defamation]] lawsuits or damage to the dis-informer's reputation.<ref name="eugeneasamier" /> ==Worldwide== {{globalize section|date=October 2023}} ===Soviet disinformation=== {{excerpt|Soviet disinformation}} ===Russian disinformation=== {{excerpt|Russian disinformation}} === Chinese disinformation === {{Excerpt|Spamouflage|paragraphs=1}} ===American disinformation=== [[File:How Disinformation Can Be Spread.jpg|thumb|How Disinformation Can Be Spread, explanation by [[United States Department of Defense|U.S. Defense Department]] (2001)]] The [[United States Intelligence Community]] appropriated use of the term ''disinformation'' in the 1950s from the Russian ''dezinformatsiya'', and began to use similar strategies<ref name="manningromerstein">{{citation|pages=82–83|title=Historical Dictionary of American Propaganda|author1=Martin J. Manning |author2=Herbert Romerstein|chapter=Disinformation|year=2004|isbn=978-0-313-29605-5|publisher=Greenwood}}</ref><ref>{{citation|page=118|title=Right Words|first=Stephen |last=Murray-Smith|publisher=Viking|year=1989|isbn=978-0-670-82825-8}}</ref> during the Cold War and in conflict with other nations.<ref name=adamtaylor /> ''[[The New York Times]]'' reported in 2000 that during the CIA's effort to substitute [[Mohammed Reza Pahlavi]] for then-[[Prime Minister of Iran]] [[Mohammad Mossadegh]], the CIA placed fictitious stories in the local newspaper.<ref name=adamtaylor /> [[Reuters]] documented how, subsequent to the 1979 Soviet Union invasion of Afghanistan during the [[Soviet–Afghan War]], the CIA put false articles in newspapers of Islamic-majority countries, inaccurately stating that Soviet embassies had "invasion day celebrations".<ref name=adamtaylor /> Reuters noted a former U.S. intelligence officer said they would attempt to gain the confidence of reporters and use them as [[secret agent]]s, to affect a nation's politics by way of their local media.<ref name=adamtaylor /> In October 1986, the term gained increased currency in the U.S. 
when it was revealed that two months previously, the [[Presidency of Ronald Reagan|Reagan Administration]] had engaged in a disinformation campaign against then-leader of [[Libya]], [[Muammar Gaddafi]].<ref name=biagi>{{citation|title=Media/Impact: An Introduction to Mass Media|first=Shirley|last=Biagi|page=328|chapter=Disinformation|year=2014|publisher=Cengage Learning|isbn=978-1-133-31138-6}}</ref> [[White House]] representative [[Larry Speakes]] said reports of a planned attack on Libya as first broken by ''[[The Wall Street Journal]]'' on August 25, 1986, were "authoritative", and other newspapers including ''[[The Washington Post]]'' then wrote articles saying this was factual.<ref name=biagi /> [[United States Department of State|U.S. State Department]] representative [[Bernard Kalb]] resigned from his position in protest over the disinformation campaign, and said: "Faith in the word of America is the pulse beat of our democracy."<ref name=biagi /> The executive branch of the [[Presidency of Ronald Reagan|Reagan administration]] kept watch on disinformation campaigns through three yearly publications by the Department of State: ''Active Measures: A Report on the Substance and Process of Anti-U.S. Disinformation and Propaganda Campaigns'' (1986); ''Report on Active Measures and Propaganda, 1986–87'' (1987); and ''Report on Active Measures and Propaganda, 1987–88'' (1989).<ref name=manningromerstein /> According to a report by [[Reuters]], the United States ran a [[Propaganda in the United States|propaganda]] campaign to spread [[COVID-19 misinformation#US anti-vax anti-China covert operation|disinformation about the Sinovac Chinese COVID-19 vaccine]], including using fake social media accounts to spread the disinformation that the Sinovac vaccine contained pork-derived ingredients and was therefore ''[[haram]]'' under [[Sharia|Islamic law]].<ref name=":6">{{Cite news |last1=Bing |first1=Chris |last2=Schechtman |first2=Joel |date=June 14, 2024 |title=Pentagon Ran Secret Anti-Vax Campaign to Undermine China during Pandemic |url=https://www.reuters.com/investigates/special-report/usa-covid-propaganda/ |work=[[Reuters]]}}</ref> Reuters said the [[ChinaAngVirus disinformation campaign]] was designed to "counter what it perceived as China's growing influence in the Philippines" and was prompted by the "[fear] that China's [[Vaccine diplomacy|COVID diplomacy]] and [[COVID-19 misinformation by China|propaganda]] could draw other Southeast Asian countries, such as Cambodia and Malaysia, closer to Beijing".<ref name=":6" /> The campaign was also described as "payback for Beijing's efforts to blame Washington for the pandemic".<ref>{{Cite web |last=Toropin |first=Konstantin |date=2024-06-14 |title=Pentagon Stands by Secret Anti-Vaccination Disinformation Campaign in Philippines After Reuters Report |url=https://www.military.com/daily-news/2024/06/14/pentagon-stands-secret-anti-vaccination-disinformation-campaign-philippines-after-reuters-report.html |url-status=live |archive-url=https://web.archive.org/web/20240614223757/https://www.military.com/daily-news/2024/06/14/pentagon-stands-secret-anti-vaccination-disinformation-campaign-philippines-after-reuters-report.html |archive-date=2024-06-14 |access-date=2024-06-19 |website=[[Military.com]] |language=en}}</ref> The campaign primarily targeted people in the [[Philippines]] and used a social media [[hashtag]] for "China is the virus" in [[Tagalog language|Tagalog]].<ref name=":6" /> The campaign ran from 2020 to mid-2021.<ref name=":6" /> The 
primary contractor for the U.S. military on the project was [[General Dynamics|General Dynamics IT]], which received $493 million for its role.<ref name=":6" /> [[Disinformation research]] has become politicised in the United States. Since 2023, [[Republican Party (United States)|Republican]] members of the [[United States Congress|US Congress]] have attacked researchers who study disinformation, characterising their work as being against [[freedom of speech]] and as a euphemism for government [[censorship]].<ref>{{Cite news |last=Myers |first=Steven Lee |last2=Frenkel |first2=Sheera |date=2023-06-19 |title=G.O.P. Targets Researchers Who Study Disinformation Ahead of 2024 Election |url=https://www.nytimes.com/2023/06/19/technology/gop-disinformation-researchers-2024-election.html |access-date=2025-04-22 |work=The New York Times |language=en-US |issn=0362-4331}}</ref><ref>{{Cite web |last=Jankowicz |first=Nina |title=Republicans Are Obsessed with a Censorship Lie |url=https://www.thebulwark.com/p/republicans-are-obsessed-with-a-censorship-lie-twitter-files-taibbi-goebbels |access-date=2025-04-22 |website=www.thebulwark.com |language=en}}</ref> On 18 April 2025, citing an executive order signed by Trump,<ref>{{Cite web |last=Scire |first=Sarah |title=National Science Foundation cancels research grants related to misinformation and disinformation |url=https://www.niemanlab.org/2025/04/national-science-foundation-cancels-research-grants-related-to-misinformation-and-disinformation/ |website=www.niemanlab.org}}</ref><ref>{{Cite news |title=A White House order claims to end 'censorship.' What does that mean? |url=https://www.npr.org/2025/01/24/nx-s1-5270071/eo-weaponization |access-date=2025-04-22 |work=NPR |language=en}}</ref> the US [[National Science Foundation]] released a statement cancelling funding for disinformation research,<ref>{{Cite web |title=Updates on NSF Priorities {{!}} NSF - National Science Foundation |url=https://www.nsf.gov/updates-on-priorities |access-date=2025-04-22 |website=www.nsf.gov |language=en}}</ref> stating that it does not fit with NSF priorities, "including but not limited to those on diversity, equity, and inclusion ([[Diversity, equity, and inclusion|DEI]]) and misinformation/disinformation."<ref>{{Cite web |last=Carnell |first=Henry |title=Government cancels disinformation grants in disinformation-filled statement |url=https://www.motherjones.com/politics/2025/04/nsf-science-disinformation-research-grants-cancelled/ |access-date=2025-04-22 |website=Mother Jones |language=en-US}}</ref> == Response == === Responses from cultural leaders === [[Pope Francis]] condemned disinformation in a 2016 interview, after being made the subject of a [[fake news website]] during the [[2016 United States presidential election|2016 U.S.
election]] cycle which falsely claimed that he supported [[Donald Trump]].<ref name="popewarnsnyt">{{citation|url=https://www.nytimes.com/aponline/2016/12/07/world/europe/ap-eu-rel-vatican-fake-news.html|work=[[The New York Times]]|date=7 December 2016|access-date=7 December 2016|agency=[[Associated Press]]|title=Pope Warns About Fake News-From Experience|archive-date=7 December 2016|archive-url=https://web.archive.org/web/20161207180844/http://www.nytimes.com/aponline/2016/12/07/world/europe/ap-eu-rel-vatican-fake-news.html|url-status=live}}</ref><ref name="alyssanewcomb">{{citation|work=[[NBC News]]|access-date=16 November 2016|url=http://www.nbcnews.com/tech/tech-news/facebook-google-crack-down-fake-news-advertising-n684101|title=Facebook, Google Crack Down on Fake News Advertising|author=Alyssa Newcomb|publisher=NBC News|date=15 November 2016|archive-date=6 April 2019|archive-url=https://web.archive.org/web/20190406233151/https://www.nbcnews.com/tech/tech-news/facebook-google-crack-down-fake-news-advertising-n684101|url-status=live}}</ref><ref name="didthepope">{{citation|url=http://www.factcheck.org/2016/10/did-the-pope-endorse-trump/|access-date=7 December 2016|work=[[FactCheck.org]]|title=Did the Pope Endorse Trump?|first=Sydney|last=Schaede|date=24 October 2016|archive-date=19 April 2019|archive-url=https://web.archive.org/web/20190419232025/https://www.factcheck.org/2016/10/did-the-pope-endorse-trump/|url-status=live}}</ref> He stated that the worst thing the news media could do was spread disinformation and said the act was a [[sin]],<ref name="popewarnsreuters">{{citation|work=[[Reuters]]|date=7 December 2016|access-date=7 December 2016|url=https://www.reuters.com/article/us-pope-media-idUSKBN13W1TU|title=Pope warns media over 'sin' of spreading fake news, smearing politicians|first=Philip|last=Pullella|archive-date=23 November 2020|archive-url=https://web.archive.org/web/20201123025949/https://www.reuters.com/article/us-pope-media-idUSKBN13W1TU|url-status=live}}</ref><ref name="harrietsherwood">{{citation|url=https://www.theguardian.com/world/2016/dec/07/pope-compares-fake-news-consumption-to-eating-faeces-coprophilia|work=[[The Guardian]]|date=7 December 2016|access-date=7 December 2016|title=Pope Francis compares fake news consumption to eating faeces|archive-date=7 March 2021|archive-url=https://web.archive.org/web/20210307225737/https://www.theguardian.com/world/2016/dec/07/pope-compares-fake-news-consumption-to-eating-faeces-coprophilia|url-status=live}}</ref> comparing those who spread disinformation to individuals who engage in [[coprophilia]].<ref name="popefranciscompares">{{citation|url=https://www.washingtonpost.com/news/acts-of-faith/wp/2016/12/07/pope-francis-compares-media-who-spread-fake-news-to-people-who-are-excited-by-feces|access-date=7 December 2016|date=7 December 2016|newspaper=[[The Washington Post]]|title=Pope Francis compares media that spread fake news to people who are excited by feces|first=Julie|last=Zauzmer|archive-date=4 February 2021|archive-url=https://web.archive.org/web/20210204225913/https://www.washingtonpost.com/news/acts-of-faith/wp/2016/12/07/pope-francis-compares-media-who-spread-fake-news-to-people-who-are-excited-by-feces/|url-status=live}}</ref><ref name="andrewgriffin">{{citation|url=https://www.independent.co.uk/news/people/pope-fake-news-francis-sexual-arousal-coprophilia-coprophagia-a7461331.html|access-date=7 December 2016|date=7 December 2016|work=[[The Independent]]|first=Andrew|last=Griffin|title=Pope Francis: Fake news is 
like getting sexually aroused by faeces|archive-date=26 January 2021|archive-url=https://web.archive.org/web/20210126190844/https://www.independent.co.uk/news/people/pope-fake-news-francis-sexual-arousal-coprophilia-coprophagia-a7461331.html|url-status=live}}</ref> === Ethics in warfare === In a contribution to the 2014 book ''Military Ethics and Emerging Technologies'', writers David Danks and Joseph H. Danks discuss the ethical implications in using disinformation as a tactic during [[information warfare]].<ref name="daviddanks">{{citation|last1=Danks|first1=David|title=Military Ethics and Emerging Technologies|pages=223–224|year=2014|editor1=Timothy J. Demy|chapter=The Moral Responsibility of Automated Responses During Cyberwarfare|publisher=Routledge|isbn=978-0-415-73710-4|last2=Danks|first2=Joseph H.|editor2=George R. Lucas Jr.|editor3=Bradley J. Strawser}}</ref> They note there has been a significant degree of philosophical debate over the issue as related to the [[ethics of war]] and use of the technique.<ref name="daviddanks" /> The writers describe a position whereby the use of disinformation is occasionally allowed, but not in all situations.<ref name="daviddanks" /> Typically the ethical test to consider is whether the disinformation was performed out of a motivation of [[good faith]] and acceptable according to the [[rules of war]].<ref name="daviddanks" /> By this test, the tactic during World War II of putting fake inflatable tanks in visible locations on the [[Pacific Islands]] in order to falsely present the impression that there were larger military forces present would be considered as ethically permissible.<ref name="daviddanks" /> Conversely, disguising a munitions plant as a healthcare facility in order to avoid attack would be outside the bounds of acceptable use of disinformation during war.<ref name="daviddanks" /> == Research == {{Main|Disinformation research}} [[File:Disinformation and echo chambers.jpg|thumb|Disinformation spreads through controversies.<ref name="DiazRuizNilsson2022" /> <p>[https://doi.org/10.1177/07439156221103852 Figure 1. 
A Framework of How Disinformation Disseminates on Social Media Through Echo Chambers] by Diaz Ruiz & Nilsson is licensed under [http://creativecommons.org/licenses/by/4.0/ CC BY 4.0]</p>]] Research related to disinformation studies is increasing as an applied area of inquiry.<ref>{{Cite journal|url=https://mediawell.ssrc.org/literature-reviews/defining-disinformation/versions/1-0/|title=Defining "Disinformation", V1.0|last=Spies|first=Samuel|date=2019-08-14|website=MediaWell, Social Science Research Council|language=en|access-date=2019-11-09|archive-date=30 October 2020|archive-url=https://web.archive.org/web/20201030094140/https://mediawell.ssrc.org/literature-reviews/defining-disinformation/versions/1-0/|url-status=live}}</ref><ref>{{Cite journal|last=Tandoc|first=Edson C.|date=2019|title=The facts of fake news: A research review|journal=Sociology Compass|language=en|volume=13|issue=9|pages=e12724|doi=10.1111/soc4.12724|s2cid=201392983|issn=1751-9020}}</ref> The call to formally classify disinformation as a cybersecurity [[Threat (computer)|threat]] is made by advocates due to its increase in social networking sites.<ref>{{Cite book|last=Caramancion|first=Kevin Matthe|title=2020 3rd International Conference on Information and Computer Technologies (ICICT) |chapter=An Exploration of Disinformation as a Cybersecurity Threat |date=2020|pages=440–444|doi=10.1109/ICICT50521.2020.00076|isbn=978-1-7281-7283-5|s2cid=218651389}}</ref> Despite the proliferation of social media websites, Facebook and Twitter showed the most activity in terms of active disinformation campaigns. Techniques reported on included the use of bots to amplify hate speech, the illegal harvesting of data, and paid trolls to harass and threaten journalists.<ref>{{Cite web|url=https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/09/CyberTroop-Report19.pdf|title=Samantha Bradshaw & Philip N. Howard. (2019) The Global Disinformation Disorder: 2019 Global Inventory of Organised Social Media Manipulation. Working Paper 2019.2. 
Oxford, UK: Project on Computational Propaganda|website=comprop.oii.ox.ac.uk|accessdate=17 November 2022|archive-date=25 May 2022|archive-url=https://web.archive.org/web/20220525105011/https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/09/CyberTroop-Report19.pdf|url-status=live}}</ref> Whereas disinformation research focuses primarily on how actors orchestrate deceptions on social media, primarily via [[fake news]], new research investigates how people take what started as deceptions and circulate them as their personal views.<ref name="DiazRuizNilsson2022">{{Cite journal |last1=Diaz Ruiz |first1=Carlos |last2=Nilsson |first2=Tomas |date=16 May 2022 |title=Disinformation and Echo Chambers: How Disinformation Circulates in Social Media Through Identity-Driven Controversies |url=http://journals.sagepub.com/doi/10.1177/07439156221103852 |journal=Journal of Public Policy & Marketing |volume=42 |pages=18–35 |doi=10.1177/07439156221103852 |s2cid=248934562 |access-date=20 June 2022 |archive-date=20 June 2022 |archive-url=https://web.archive.org/web/20220620070343/https://journals.sagepub.com/doi/10.1177/07439156221103852 |url-status=live }}</ref> As a result, research shows that disinformation can be conceptualized as a program that encourages engagement in oppositional fantasies (i.e., [[culture war]]s), through which disinformation circulates as rhetorical ammunition for never-ending arguments.<ref name="DiazRuizNilsson2022" /> As disinformation entangles with [[culture war]]s, identity-driven controversies constitute a vehicle through which disinformation disseminates on [[social media]]. This means that disinformation thrives, not despite raucous grudges but because of them. The reason is that controversies provide fertile ground for never-ending debates that solidify points of view.<ref name="DiazRuizNilsson2022" /> Scholars have pointed out that disinformation is not only a foreign threat as domestic purveyors of disinformation are also leveraging traditional media outlets such as newspapers, radio stations, and television news media to disseminate false information.<ref>{{Cite journal |last1=Miller |first1=Michael L. 
|last2=Vaccari |first2=Cristian |date=July 2020 |title=Digital Threats to Democracy: Comparative Lessons and Possible Remedies |url=http://journals.sagepub.com/doi/10.1177/1940161220922323 |journal=The International Journal of Press/Politics |language=en |volume=25 |issue=3 |pages=333–356 |doi=10.1177/1940161220922323 |s2cid=218962159 |issn=1940-1612 |access-date=14 December 2022 |archive-date=14 December 2022 |archive-url=https://web.archive.org/web/20221214125205/https://journals.sagepub.com/doi/10.1177/1940161220922323 |url-status=live }}</ref> Current research suggests right-wing online political [[Activism|activists]] in the United States may be more likely to use disinformation as a strategy and tactic.<ref>{{Cite journal|last1=Freelon|first1=Deen|last2=Marwick|first2=Alice|last3=Kreiss|first3=Daniel|date=2020-09-04|title=False equivalencies: Online activism from left to right|url=https://www.science.org/doi/abs/10.1126/science.abb2428|journal=Science|volume=369|issue=6508|pages=1197–1201|language=EN|doi=10.1126/science.abb2428|pmid=32883863|bibcode=2020Sci...369.1197F|s2cid=221471947|access-date=2 February 2022|archive-date=21 October 2021|archive-url=https://web.archive.org/web/20211021142705/https://www.science.org/doi/abs/10.1126/science.abb2428|url-status=live|url-access=subscription}}</ref> Governments have responded with a wide range of policies to address concerns about the potential threats that disinformation poses to democracy; however, there is little agreement in elite policy discourse or academic literature as to what it means for disinformation to threaten democracy, and how different policies might help to counter its negative implications.<ref>{{Cite journal |last=Tenove |first=Chris |date=July 2020 |title=Protecting Democracy from Disinformation: Normative Threats and Policy Responses |url=http://journals.sagepub.com/doi/10.1177/1940161220918740 |journal=The International Journal of Press/Politics |language=en |volume=25 |issue=3 |pages=517–537 |doi=10.1177/1940161220918740 |s2cid=219437151 |issn=1940-1612 |access-date=14 December 2022 |archive-date=14 December 2022 |archive-url=https://web.archive.org/web/20221214125205/https://journals.sagepub.com/doi/10.1177/1940161220918740 |url-status=live |url-access=subscription }}</ref> === Consequences of exposure to disinformation online === There is a broad consensus amongst scholars that there is a high degree of disinformation, misinformation, and propaganda online; however, it is unclear what effect such disinformation has on political attitudes among the public and, therefore, on political outcomes.<ref name=":1">{{Cite journal|last1=Tucker|first1=Joshua|last2=Guess|first2=Andrew|last3=Barbera|first3=Pablo|last4=Vaccari|first4=Cristian|last5=Siegel|first5=Alexandra|last6=Sanovich|first6=Sergey|last7=Stukal|first7=Denis|last8=Nyhan|first8=Brendan|date=2018|title=Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature|journal=SSRN Working Paper Series|url=https://www.ssrn.com/abstract=3144139|doi=10.2139/ssrn.3144139|issn=1556-5068|access-date=29 October 2019|archive-date=21 February 2021|archive-url=https://web.archive.org/web/20210221202942/https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3144139|url-status=live}}</ref> This [[conventional wisdom]] has come mostly from investigative journalists, with a particular rise during the 2016 U.S.
election: some of the earliest work came from Craig Silverman at BuzzFeed News.<ref>{{Cite web|url=https://www.buzzfeednews.com/article/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook|title=This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook|website=BuzzFeed News|date=16 November 2016|access-date=2019-10-29|archive-date=17 July 2018|archive-url=https://web.archive.org/web/20180717155014/https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook|url-status=live}}</ref> Cass Sunstein supported this in ''#Republic'', arguing that the internet would become rife with [[echo chamber (media)|echo chambers]] and informational cascades of misinformation, leading to a highly polarized and ill-informed society.<ref>{{Cite book|title=#Republic : divided democracy in the age of social media|last=Sunstein, Cass R.|isbn=978-0691175515|location=Princeton|oclc=958799819|date=14 March 2017|url-access=registration|url=https://archive.org/details/republi_sun_2017_00_0042}}</ref> Research after the 2016 election found: (1) for 14 percent of Americans, social media was their "most important" source of election news; (2) known false news stories "favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times"; (3) the average American adult saw fake news stories, "with just over half of those who recalled seeing them believing them"; and (4) people are more likely to "believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks."<ref>{{Cite journal|last1=Allcott|first1=Hunt|last2=Gentzkow|first2=Matthew|date=May 2017|title=Social Media and Fake News in the 2016 Election|journal=Journal of Economic Perspectives|language=en|volume=31|issue=2|pages=211–236|doi=10.1257/jep.31.2.211|s2cid=32730475|issn=0895-3309|doi-access=free}}</ref> Correspondingly, whilst there is wide agreement that the digital spread and uptake of disinformation during the 2016 election was massive and very likely facilitated by foreign agents, there is an ongoing debate on whether all this had any actual effect on the election. For example, a double-blind randomized control experiment by researchers from the London School of Economics (LSE) found that exposure to online fake news about either Trump or Clinton had no significant effect on intentions to vote for those candidates. Researchers who examined the influence of Russian disinformation on Twitter during the 2016 US presidential campaign found that exposure to disinformation was (1) concentrated among a tiny group of users, (2) primarily among Republicans, and (3) eclipsed by exposure to legitimate political news media and politicians.
Finally, they find "no evidence of a meaningful relationship between exposure to the Russian foreign influence campaign and changes in attitudes, polarization, or voting behavior."<ref>{{Cite journal|last1=Eady|first1=Gregory|last2=Paskhalis|first2=Tom|last3=Zilinsky|first3=Jan|last4=Bonneau|first4=Richard|last5=Nagler|first5=Jonathan|last6=Tucker|first6=Joshua A.|date=2023-01-09|title=Exposure to the Russian Internet Research Agency Foreign Influence Campaign on Twitter in the 2016 US Election and its Relationship to Attitudes and Voting Behavior|journal=Nature Communications|volume=14|issue=62|page=62 |doi=10.1038/s41467-022-35576-9|pmid=36624094 |pmc=9829855 |bibcode=2023NatCo..14...62E |doi-access=free}}</ref> As such, despite its mass dissemination during the 2016 Presidential Elections, online fake news or disinformation probably did not cost Hillary Clinton the votes needed to secure the presidency.<ref>{{cite journal |last1=Leyva |first1=Rodolfo |title=Testing and unpacking the effects of digital fake news: on presidential candidate evaluations and voter support |journal=AI & Society |date=2020 |volume=35 |issue=4 |page=970 |doi=10.1007/s00146-020-00980-6 |s2cid=218592685 |doi-access=free }}</ref> Research on this topic remains inconclusive, for example, misinformation appears not to significantly change political knowledge of those exposed to it.<ref>{{Cite journal|last1=Allcott|first1=Hunt|last2=Gentzkow|first2=Matthew|date=May 2017|title=Social Media and Fake News in the 2016 Election|journal=Journal of Economic Perspectives|volume=31|issue=2|pages=211–236|doi=10.1257/jep.31.2.211|issn=0895-3309|doi-access=free}}</ref> There seems to be a higher level of diversity of news sources that users are exposed to on Facebook and Twitter than conventional wisdom would dictate, as well as a higher frequency of cross-spectrum discussion.<ref>{{Cite journal|last1=Bakshy|first1=E.|last2=Messing|first2=S.|last3=Adamic|first3=L. 
A.|date=2015-06-05|title=Exposure to ideologically diverse news and opinion on Facebook|journal=Science|volume=348|issue=6239|pages=1130–1132|doi=10.1126/science.aaa1160|pmid=25953820|issn=0036-8075|bibcode=2015Sci...348.1130B|s2cid=206632821|doi-access=free}}</ref><ref>{{Cite journal|last1=Wojcieszak|first1=Magdalena E.|last2=Mutz|first2=Diana C.|date=2009-03-01|title=Online Groups and Political Discourse: Do Online Discussion Spaces Facilitate Exposure to Political Disagreement?|journal=Journal of Communication|volume=59|issue=1|pages=40–56|doi=10.1111/j.1460-2466.2008.01403.x|s2cid=18865773 |issn=0021-9916}}</ref> Other evidence has found that disinformation campaigns rarely succeed in altering the foreign policies of the targeted states.<ref name=":0">{{Cite journal|last=Lanoszka|first=Alexander|date=2019|title=Disinformation in international politics|journal=European Journal of International Security|volume=4|issue=2|pages=227–248|doi=10.1017/eis.2019.6|s2cid=211312944|issn=2057-5637}}</ref> Research is also challenging because disinformation is meant to be difficult to detect and some social media companies have discouraged outside research efforts.<ref name=":2">{{Cite journal|last1=Shu|first1=Kai|last2=Sliva|first2=Amy|last3=Wang|first3=Suhang|last4=Tang|first4=Jiliang|last5=Liu|first5=Huan|date=2017-09-01|title=Fake News Detection on Social Media: A Data Mining Perspective|url=https://doi.org/10.1145/3137597.3137600|journal=ACM SIGKDD Explorations Newsletter|volume=19|issue=1|pages=22–36|doi=10.1145/3137597.3137600|arxiv=1708.01967|s2cid=207718082|issn=1931-0145|access-date=1 February 2022|archive-date=5 February 2022|archive-url=https://web.archive.org/web/20220205204457/https://dl.acm.org/doi/10.1145/3137597.3137600|url-status=live}}</ref> For example, researchers found disinformation made "existing detection algorithms from traditional news media ineffective or not applicable...[because disinformation] is intentionally written to mislead readers...[and] users' social engagements with fake news produce data that is big, incomplete, unstructured, and noisy."<ref name=":2" /> Facebook, the largest social media company, has been criticized by [[Analytic journalism|analytical journalists]] and scholars for preventing outside research of disinformation.<ref>{{Cite web|last1=Edelson|first1=Laura|last2=McCoy|first2=Damon|title=How Facebook Hinders Misinformation Research|url=https://www.scientificamerican.com/article/how-facebook-hinders-misinformation-research/|access-date=2022-02-01|website=Scientific American|language=en|archive-date=2 February 2022|archive-url=https://web.archive.org/web/20220202025821/https://www.scientificamerican.com/article/how-facebook-hinders-misinformation-research/|url-status=live}}</ref><ref>{{Cite web|last1=Edelson|first1=Laura|last2=McCoy|first2=Damon|date=2021-08-14|title=Facebook shut down our research into its role in spreading disinformation|url=http://www.theguardian.com/technology/2021/aug/14/facebook-research-disinformation-politics|access-date=2022-02-01|website=The Guardian|language=en|archive-date=24 March 2022|archive-url=https://web.archive.org/web/20220324171518/https://www.theguardian.com/technology/2021/aug/14/facebook-research-disinformation-politics|url-status=live}}</ref><ref>{{Cite journal|last1=Krishnan|first1=Nandita|last2=Gu|first2=Jiayan|last3=Tromble|first3=Rebekah|last4=Abroms|first4=Lorien C.|date=2021-12-15|title=Research note: Examining how various social media platforms have responded to COVID-19 
misinformation|url=https://misinforeview.hks.harvard.edu/article/research-note-examining-how-various-social-media-platforms-have-responded-to-covid-19-misinformation/|journal=Harvard Kennedy School Misinformation Review|language=en-US|doi=10.37016/mr-2020-85|s2cid=245256590|doi-access=free|access-date=1 February 2022|archive-date=3 February 2022|archive-url=https://web.archive.org/web/20220203040557/https://misinforeview.hks.harvard.edu/article/research-note-examining-how-various-social-media-platforms-have-responded-to-covid-19-misinformation/|url-status=live}}</ref><ref>{{Cite news|title=Only Facebook knows the extent of its misinformation problem. And it's not sharing, even with the White House.|language=en-US|newspaper=Washington Post|url=https://www.washingtonpost.com/technology/2021/08/19/facebook-data-sharing-struggle/|access-date=2022-02-01|issn=0190-8286|archive-date=5 February 2022|archive-url=https://web.archive.org/web/20220205042625/https://www.washingtonpost.com/technology/2021/08/19/facebook-data-sharing-struggle/|url-status=live}}</ref> === Alternative perspectives and critiques === Researchers have criticized the framing of disinformation as being limited to technology platforms, removed from its wider political context and inaccurately implying that the media landscape was otherwise well-functioning.<ref>{{Cite journal |last1=Kuo |first1=Rachel |last2=Marwick |first2=Alice |date=2021-08-12 |title=Critical disinformation studies: History, power, and politics |url=https://misinforeview.hks.harvard.edu/article/critical-disinformation-studies-history-power-and-politics/ |journal=Harvard Kennedy School Misinformation Review |language=en-US |doi=10.37016/mr-2020-76 |archive-url=https://web.archive.org/web/20231015223538/https://misinforeview.hks.harvard.edu/article/critical-disinformation-studies-history-power-and-politics/ |archive-date=2023-10-15 |doi-access=free}}</ref> "The field possesses a simplistic understanding of the effects of media technologies; overemphasizes platforms and underemphasizes politics; focuses too much on the United States and Anglocentric analysis; has a shallow understanding of political culture and culture in general; lacks analysis of race, class, gender, and sexuality as well as status, inequality, social structure, and power; has a thin understanding of journalistic processes; and, has progressed more through the exigencies of grant funding than the development of theory and empirical findings."<ref>{{Cite web |title=What Comes After Disinformation Studies? |url=https://citap.unc.edu/events/ica-preconference-2022/ |archive-url=https://web.archive.org/web/20230203095200/https://citap.unc.edu/ica-preconference-2022/ |archive-date=2023-02-03 |access-date=2024-01-16 |website=Center for Information, Technology, & Public Life (CITAP), University of North Carolina at Chapel Hill |language=en}}</ref> Alternative perspectives have been proposed: # Moving beyond ''[[fact-checking]] and [[media literacy]]'' to study a pervasive phenomenon as something that involves more than news consumption. # Moving beyond ''technical solutions'' including AI-enhanced [[fact checking]] to understand the systemic basis of disinformation. # Develop a theory that goes beyond ''[[Americentrism]]'' to develop a global perspective, understand cultural imperialism and Third World dependency on Western news'',<ref>{{Cite web |last=Tworek |first=Heidi |date=2022-08-02 |title=Can We Move Beyond Disinformation Studies? 
|url=https://www.cigionline.org/articles/can-we-move-beyond-disinformation-studies/ |archive-url=https://web.archive.org/web/20230601164258/https://www.cigionline.org/articles/can-we-move-beyond-disinformation-studies/ |archive-date=2023-06-01 |access-date=2024-01-16 |website=[[Centre for International Governance Innovation]]}}</ref> and understanding disinformation in the Global South.<ref>{{Cite book |url=https://onlinelibrary.wiley.com/doi/book/10.1002/9781119714491 |title=Disinformation in the Global South |date=2022-04-12 |publisher=Wiley |isbn=978-1-119-71444-6 |editor-last=Wasserman |editor-first=Herman |edition=1 |language=en |doi=10.1002/9781119714491 |editor-last2=Madrid-Morales |editor-first2=Dani |access-date=4 March 2024 |archive-date=4 March 2024 |archive-url=https://web.archive.org/web/20240304064723/https://onlinelibrary.wiley.com/doi/book/10.1002/9781119714491 |url-status=live }}</ref>
# Developing ''market-oriented disinformation research'' that examines the financial incentives and [[business model]]s that nudge content creators and [[Digital platform (infrastructure)|digital platforms]] to circulate disinformation online.<ref name=":5" /><ref name=":7" />
# Including a ''multidisciplinary approach'' involving [[history]], [[political economy]], [[ethnic studies]], [[feminist studies]], and [[science and technology studies]].
# Developing understandings of ''gender-based disinformation (GBD)'', defined as "the dissemination of false or misleading information attacking women (especially political leaders, journalists and public figures), basing the attack on their identity as women."<ref>{{Cite web |last=Sessa |first=Maria Giovanna |date=2020-12-04 |title=Misogyny and Misinformation: An analysis of gendered disinformation tactics during the COVID-19 pandemic |url=https://www.disinfo.eu/publications/misogyny-and-misinformation:-an-analysis-of-gendered-disinformation-tactics-during-the-covid-19-pandemic/ |archive-url=https://web.archive.org/web/20230919005420/https://www.disinfo.eu/publications/misogyny-and-misinformation:-an-analysis-of-gendered-disinformation-tactics-during-the-covid-19-pandemic/ |archive-date=2023-09-19 |access-date=2024-01-16 |website=EU DisinfoLab |language=en-US}}</ref><ref>{{Cite web |last=Sessa |first=Maria Giovanna |date=2022-01-26 |title=What is Gendered Disinformation? |url=https://il.boell.org/en/2022/01/26/what-gendered-disinformation |archive-url=https://web.archive.org/web/20220721155302/https://il.boell.org/en/2022/01/26/what-gendered-disinformation |archive-date=2022-07-21 |access-date=2024-01-16 |website=[[Heinrich Böll Foundation]] |language=en}}</ref>

== Strategies for spreading disinformation ==

=== Disinformation attack ===
{{main|Disinformation attack}}
The research literature on how disinformation spreads is growing.<ref name=":1" /> Studies show that the spread of disinformation on social media can be classified into two broad stages: seeding and echoing.<ref name="DiazRuizNilsson2022" /> "Seeding" is when malicious actors strategically insert deceptions, such as fake news, into a social media ecosystem; "echoing" is when the audience disseminates the disinformation argumentatively as their own opinions, often by incorporating it into a confrontational fantasy.
=== Internet manipulation ===
{{excerpt|Internet manipulation}}
Studies show four main methods of seeding disinformation online:<ref name=":1" />
# Selective censorship
# Manipulation of search rankings
# Hacking and releasing
# Directly sharing disinformation

=== Exploiting online advertising technologies ===
Disinformation is amplified online due to malpractice in [[online advertising]], especially the [[Machine to machine|machine-to-machine]] interactions of [[real-time bidding]] systems.<ref>{{Cite journal |last1=Braun |first1=Joshua A. |last2=Eklund |first2=Jessica L. |date=2019-01-02 |title=Fake News, Real Money: Ad Tech Platforms, Profit-Driven Hoaxes, and the Business of Journalism |url=https://www.tandfonline.com/doi/full/10.1080/21670811.2018.1556314 |journal=Digital Journalism |language=en |volume=7 |issue=1 |pages=1–21 |doi=10.1080/21670811.2018.1556314 |issn=2167-0811}}</ref> Online advertising technologies have been used to amplify disinformation because of the financial incentives created by the [[monetization]] of [[user-generated content]] and [[fake news]].<ref name=":7">{{Cite journal |last=Diaz Ruiz |first=Carlos A. |date=2024-10-30 |title=Disinformation and fake news as externalities of digital advertising: a close reading of sociotechnical imaginaries in programmatic advertising |journal=Journal of Marketing Management |language=en |pages=1–23 |doi=10.1080/0267257X.2024.2421860 |issn=0267-257X|doi-access=free }}</ref> Lax oversight of the online advertising market can also be exploited to amplify disinformation, including through the use of [[dark money]] for [[Social media use in politics|political advertising]].<ref>{{Cite web |last1=Nadler |first1=Anthony |last2=Donovan |first2=Joan |last3=Crane |first3=Matthew |date=2018-10-17 |title=Weaponizing the Digital Influence Machine |url=https://datasociety.net/library/weaponizing-the-digital-influence-machine/ |access-date=2024-11-21 |website=Data & Society |language=en-US}}</ref>

==See also==
{{div col|colwidth=20em}}<!---♦♦♦ Please keep the list in alphabetical order ♦♦♦--->
* [[Active Measures Working Group]]
* [[Agitprop]]
* [[Artificial intelligence and elections]]
* [[Chinese information operations and information warfare]]
* [[Counter Misinformation Team]]
* [[Demoralization (warfare)]]
* [[Denial and deception]]
* [[Disinformation in the Russian invasion of Ukraine]]
* [[The Disinformation Project]]
* [[False flag]]
* [[Fear, uncertainty and doubt]]
* [[Gaslighting]]
* [[Internet manipulation]]
* [[Knowledge falsification]]
* [[Media manipulation]]
* [[Military deception]]
* [[Post-truth politics]]
* [[Social engineering (political science)]]
* [[State-sponsored Internet propaganda]]
{{div col end}}

==Notes==
{{notelist}}

==References==
{{Reflist|2}}

==Further reading==
{{Library resources box}}
* {{citation|first=Ladislav|last=Bittman|author-link=Lawrence Martin-Bittman|title=The KGB and Soviet Disinformation: An Insider's View|year=1985|isbn=978-0-08-031572-0|publisher=Pergamon-Brassey's|title-link=The KGB and Soviet Disinformation}}
* {{citation|url=https://commons.wikimedia.org/wiki/File:Operation_INFEKTION_-_Soviet_Bloc_Intelligence_and_Its_AIDS_Disinformation_Campaign.pdf|access-date=9 December 2016|title=Operation INFEKTION – Soviet Bloc Intelligence and Its AIDS Disinformation Campaign|first=Thomas|last=Boghardt|date=26 January 2010|journal=[[Studies in Intelligence]]|volume=53|issue=4}}
* {{citation|title=New Lies for Old: The Communist Strategy of Deception and Disinformation|year=1984|author-link=Anatoliy Golitsyn|first=Anatoliy|last=Golitsyn|publisher=Dodd, Mead & Company|isbn=978-0-396-08194-4}}
* [[Cailin O'Connor|O'Connor, Cailin]], and [[James Owen Weatherall]], "Why We Trust Lies: The most effective misinformation starts with seeds of [[truth]]", ''[[Scientific American]]'', vol. 321, no. 3 (September 2019), pp. 54–61.
* {{citation|title=Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism|year=2013|author=[[Ion Mihai Pacepa]] and [[Ronald J. Rychlak]]|isbn=978-1-936488-60-5|publisher=WND Books|title-link=Disinformation (book)}}
* {{citation|title=Deception, Disinformation, and Strategic Communications: How One Interagency Group Made a Major Difference|author1=Fletcher Schoen|author2=Christopher J. Lamb|journal=Strategic Perspectives|volume=11|date=1 June 2012|url=https://commons.wikimedia.org/wiki/File:Deception,_Disinformation,_and_Strategic_Communications.pdf|access-date=9 December 2016}}
* {{citation|title=Dezinformatsia: Active Measures in Soviet Strategy|year=1984|first1=Richard H.|last1=Shultz|first2=Roy|last2=Godson|publisher=Pergamon-Brassey's|isbn=978-0080315737|author1-link=Richard H. Shultz|author2-link=Roy Godson|title-link=Dezinformatsia (book)}}
* {{citation|url=https://www.washingtonpost.com/news/worldviews/wp/2016/11/26/before-fake-news-there-was-soviet-disinformation|newspaper=[[The Washington Post]]|date=26 November 2016|access-date=3 December 2016|title=Before 'fake news,' there was Soviet 'disinformation'|first=Adam|last=Taylor}}
* {{citation |url=https://shorensteincenter.org/the-fight-against-disinformation-in-the-u-s-a-landscape-analysis/ |title=The Fight Against Disinformation in the U.S.: A Landscape Analysis |date=1 November 2018 |first1=Heidi |last1=Legg |first2=Joe |last2=Kerwin |publisher=[[Shorenstein Center on Media, Politics and Public Policy|Harvard Kennedy School, Shorenstein Center]] |access-date=10 August 2020}}

==External links==
{{wiktionary|disinformation}}
{{Commons category|Disinformation}}
{{Wikiquote}}
* [http://www.bl.uk/learning/cult/disinfo/disinformation.html Disinformation] {{Webarchive|url=https://web.archive.org/web/20071025061754/http://www.bl.uk/learning/cult/disinfo/disinformation.html |date=25 October 2007 }} – a learning resource from the British Library including an interactive movie and activities.
* [https://mediawell.ssrc.org/ MediaWell] – an initiative of the nonprofit [[Social Science Research Council]] seeking to track and curate disinformation, misinformation, and fake news research.
* [https://commonslibrary.org/how-civil-society-can-counter-disinformation/ How Civil Society can Counter Disinformation] – from the Commons Social Change Library.

{{Disinformation|state=expanded}}
{{Propaganda}}
{{Media culture}}
{{Media manipulation}}
{{Military deception}}
{{Authority control}}

[[Category:Disinformation| ]]
[[Category:Deception]]
[[Category:Communication of falsehoods]]
[[Category:Media manipulation]]
[[Category:Propaganda techniques]]
[[Category:Black propaganda]]
[[Category:1920s neologisms]]
[[Category:Psychological warfare techniques]]
[[Category:Intelligence operations by type]]
[[Category:Lying]]