== Associated effects and outcomes ==

=== Polarization of opinion ===
{{Main|Attitude polarization}}

When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".<ref name="kuhn_lao" /> The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60 percent black and 40 percent red balls; the other, 40 percent black and 60 percent red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw: whether they initially thought the basket with 60 percent black balls or the one with 60 percent red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.<ref>{{Harvnb|Baron|2000|p=201}}</ref>
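
Why an alternating sequence favors neither basket can be made explicit with a short calculation (an illustrative sketch assuming equal prior probabilities for the two baskets; it is not a result reported in the cited study). Writing ''A'' for the mostly-black basket and ''b'', ''r'' for a black and a red draw, Bayes' theorem gives

<math display="block">P(A \mid b, r) = \frac{0.5 \times 0.6 \times 0.4}{0.5 \times 0.6 \times 0.4 \;+\; 0.5 \times 0.4 \times 0.6} = 0.5,</math>

and the same cancellation occurs after any run containing equal numbers of black and red balls, so a rational estimate stays at 50 percent no matter how many draws have been seen.
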
A less abstract study was the Stanford biased interpretation experiment, in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.<ref name="lord1979">{{Citation |last1=Lord |first1=Charles G. |first2=Lee |last2=Ross |first3=Mark R. |last3=Lepper |year=1979 |title=Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence |journal=Journal of Personality and Social Psychology |volume=37 |issue=11 |pages=2098–2109 |issn=0022-3514 |doi=10.1037/0022-3514.37.11.2098 |citeseerx=10.1.1.372.1743 |s2cid=7465318}}</ref> In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.<ref name="taber_political">{{Citation |last1=Taber |first1=Charles S. |first2=Milton |last2=Lodge |date=July 2006 |title=Motivated skepticism in the evaluation of political beliefs |journal=American Journal of Political Science |volume=50 |issue=3 |pages=755–769 |issn=0092-5853 |doi=10.1111/j.1540-5907.2006.00214.x |citeseerx=10.1.1.472.7064 |s2cid=3770487}}</ref><ref name="kuhn_lao">{{Citation |last1=Kuhn |first1=Deanna |first2=Joseph |last2=Lao |date=March 1996 |title=Effects of evidence on attitudes: Is polarization the norm? |journal=Psychological Science |volume=7 |issue=2 |pages=115–120 |doi=10.1111/j.1467-9280.1996.tb00340.x |s2cid=145659040}}</ref><ref>{{Citation |last1=Miller |first1=A.G. |first2=J.W. |last2=McHoskey |first3=C.M. |last3=Bane |first4=T.G. |last4=Dowd |s2cid=14102789 |year=1993 |title=The attitude polarization phenomenon: Role of response measure, attitude extremity, and behavioral consequences of reported attitude change |journal=Journal of Personality and Social Psychology |volume=64 |issue=4 |pages=561–574 |doi=10.1037/0022-3514.64.4.561}}</ref> Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, happening only in a small minority of cases, and that it was prompted not only by considering mixed evidence, but by merely thinking about the topic.<ref name="kuhn_lao" />

Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of [[gun politics|gun control]] and [[affirmative action]].<ref name="taber_political" /> They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read from a list prepared by the experimenters. For example, they could read arguments on gun control from the [[National Rifle Association of America]] and the [[Brady Campaign|Brady Anti-Handgun Coalition]]. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect.<ref name="taber_political" />

The '''{{vanchor|backfire effect}}''' is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly.<ref>{{Citation |url=http://www.skepdic.com/backfireeffect.html |title=Backfire effect |work=[[The Skeptic's Dictionary]] |access-date=26 April 2012 |archive-date=6 February 2017 |archive-url=https://web.archive.org/web/20170206213300/http://www.skepdic.com/backfireeffect.html |url-status=live}}</ref><ref name="CJR backfire">{{Citation |url=https://www.cjr.org/behind_the_news/the_backfire_effect.php |title=The backfire effect |access-date=1 May 2012 |last=Silverman |first=Craig |date=17 June 2011 |work=Columbia Journalism Review |quote=When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger. |archive-date=25 April 2012 |archive-url=https://web.archive.org/web/20120425224027/http://www.cjr.org/behind_the_news/the_backfire_effect.php |url-status=live}}</ref> The phrase was coined by [[Brendan Nyhan]] and Jason Reifler in 2010.<ref>Nyhan, B. & Reifler, J. (2010). "When corrections fail: The persistence of political misperceptions". ''Political Behavior'', 32, 303–320.</ref>
However, subsequent research has since failed to replicate findings supporting the backfire effect.<ref>{{Cite news |url=https://educationblog.oup.com/theory-of-knowledge/facts-matter-after-all-rejecting-the-backfire-effect |title=Facts matter after all: rejecting the "backfire effect" |date=12 March 2018 |work=Oxford Education Blog |access-date=23 October 2018 |language=en-GB |archive-date=23 October 2018 |archive-url=https://web.archive.org/web/20181023234412/https://educationblog.oup.com/theory-of-knowledge/facts-matter-after-all-rejecting-the-backfire-effect |url-status=live}}</ref> One study conducted at Ohio State University and George Washington University tested 10,100 participants on 52 issues expected to trigger a backfire effect. While the findings showed that individuals are reluctant to embrace facts that contradict their existing ideology, no cases of backfire were detected.<ref name="wood">{{Cite journal |last1=Wood |first1=Thomas |last2=Porter |first2=Ethan |date=2019 |title=The elusive backfire effect: Mass attitudes' steadfast factual adherence |journal=Political Behavior |volume=41 |pages=135–163 |doi=10.2139/ssrn.2819073 |issn=1556-5068 |mode=cs2}}</ref> The backfire effect has since been noted to be a rare phenomenon rather than a common occurrence<ref>{{Cite web |url=https://www.poynter.org/news/fact-checking-doesnt-backfire-new-study-suggests |title=Fact-checking doesn't 'backfire,' new study suggests |website=Poynter |language=en |access-date=23 October 2018 |date=2 November 2016 |mode=cs2 |archive-date=24 October 2018 |archive-url=https://web.archive.org/web/20181024035251/https://www.poynter.org/news/fact-checking-doesnt-backfire-new-study-suggests |url-status=live}}</ref> (compare the [[Boomerang effect (psychology)|boomerang effect]]).

=== Persistence of discredited beliefs ===
{{main|Belief perseverance}}
{{see also|Cognitive dissonance|Monty Hall problem}}

{{Quote box |quote=Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases. |source=–Lee Ross and Craig Anderson<ref name="shortcomings" /> |width=30% |align=right}}

Confirmation biases provide one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted.<ref name="nickerson" />{{rp|187}} This belief perseverance effect was first demonstrated experimentally by [[Leon Festinger|Festinger]], Riecken, and Schachter. These psychologists [[Participant observation|spent time with]] a cult whose members were convinced that the world would end on 21 December 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named ''[[When Prophecy Fails]]''.<ref>{{Citation |last=Festinger |first=Leon |title=When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world |publisher=New York: Harper Torchbooks |year=1956}}</ref>

The term ''belief perseverance'', however, was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their [[attitude change]] is measured, then the fakery is exposed in detail.
Their attitudes are then measured once more to see if their belief returns to its previous level.<ref name="shortcomings">{{Citation |last1=Ross |first1=Lee |first2=Craig A. |last2=Anderson |title=Judgment under uncertainty: Heuristics and biases |journal=Science |volume=185 |issue=4157 |pages=1124–1131 |bibcode=1974Sci...185.1124T |doi=10.1126/science.185.4157.1124 |year=1974 |pmid=17835457 |s2cid=143452957}}.<br />{{Citation |editor1-first=Daniel |editor1-last=Kahneman |editor2-first=Paul |editor2-last=Slovic |editor3-first=Amos |editor3-last=Tversky |publisher=Cambridge University Press |year=1982 |chapter=Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments |isbn=978-0-521-28414-1 |oclc=7578020 |title=Judgment under uncertainty: Heuristics and biases}}</ref> A common finding is that at least some of the initial belief remains even after a full debriefing.<ref name="kunda99">{{Harvnb|Kunda|1999|p=99}}</ref>

In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.<ref>{{Citation |last1=Ross |first1=Lee |first2=Mark R. |last2=Lepper |first3=Michael |last3=Hubbard |title=Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm |journal=Journal of Personality and Social Psychology |volume=32 |issn=0022-3514 |pages=880–892 |year=1975 |issue=5 |doi=10.1037/0022-3514.32.5.880 |pmid=1185517}} via {{Harvnb|Kunda|1999|p=99}}</ref>

In another study, participants read [[job performance]] ratings of two firefighters, along with their responses to a [[risk aversion]] test.<ref name="shortcomings" /> This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told he did less well than a risk-averse colleague.<ref name="socialperseverance" /> Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive.<ref name="socialperseverance">{{Citation |title=Perseverance of social theories: The role of explanation in the persistence of discredited information |first1=Craig A. |last1=Anderson |first2=Mark R. |last2=Lepper |first3=Lee |last3=Ross |journal=Journal of Personality and Social Psychology |year=1980 |volume=39 |issue=6 |pages=1037–1049 |issn=0022-3514 |doi=10.1037/h0077720 |citeseerx=10.1.1.130.933}}</ref> When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained.<ref name="shortcomings" /> Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.<ref name="socialperseverance" />

The [[continued influence effect]] is the tendency for misinformation to continue to influence memory and reasoning about an event, despite the misinformation having been retracted or corrected.
This occurs even when the individual believes the correction.<ref>{{cite journal |last=Cacciatore |first=Michael A. |title=Misinformation and public opinion of science and health: Approaches, findings, and future directions |journal=Proceedings of the National Academy of Sciences |volume=118 |issue=15 |date=9 April 2021 |issn=0027-8424 |doi=10.1073/pnas.1912437117 |page=e1912437117 |pmid=33837143 |pmc=8053916 |bibcode=2021PNAS..11812437C |quote=The CIE refers to the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning |quote-page=4 |mode=cs2 |doi-access=free}}</ref>

=== Preference for early information ===

Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order.<ref name="baron197">{{Harvnb|Baron|2000|pp=197–200}}</ref> This ''irrational primacy effect'' is independent of the [[serial position effect|primacy effect in memory]], in which the earlier items in a series leave a stronger memory trace.<ref name="baron197" /> Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.<ref name="nickerson" />{{rp|187}}

One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them.<ref name="baron197" /> In fact, the colors appeared in a prearranged order: the first thirty draws favored one urn and the next thirty favored the other.<ref name="nickerson" />{{rp|187}} The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty.<ref name="baron197" />
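
The rational benchmark here can be made explicit with a short calculation (an illustrative sketch, not taken from the cited experiments, assuming the two urns have mirrored proportions ''p'' and 1 − ''p'' of the two colors and equal prior probabilities). The likelihood of a sequence of independent draws depends only on the color counts, not on their order:

<math display="block">P(\text{sequence} \mid \text{urn}) = \prod_{i=1}^{60} P(x_i \mid \text{urn}) = p^{\,n_1}(1-p)^{\,n_2},</math>

where <math>n_1</math> and <math>n_2</math> are the numbers of chips of each color. For the neutral series described above, <math>n_1 = n_2</math>, so the two urns' likelihoods coincide and the posterior stays at the 50 percent prior, whichever thirty draws came first.
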
Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide.<ref name="baron197" /> After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.<ref name="nickerson" />{{rp|187}}

=== Illusory association between events ===
{{Main|Illusory correlation}}

Illusory correlation is the tendency to see non-existent correlations in a set of data.<ref name=fine>{{Harvnb|Fine|2006|pp=66–70}}</ref> This tendency was first demonstrated in a series of experiments in the late 1960s.<ref name="plous164">{{Harvnb|Plous|1993|pp=164–166}}</ref> In one experiment, participants read a set of psychiatric case studies, including responses to the [[Rorschach inkblot test]]. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men.<ref name=fine /> In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.<ref name=fine /><ref name=plous164 />

Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.<ref>{{Citation |last1=Redelmeier |first1=D.A. |first2=Amos |last2=Tversky |year=1996 |title=On the belief that arthritis pain is related to the weather |journal=Proceedings of the National Academy of Sciences |volume=93 |issue=7 |pages=2895–2896 |doi=10.1073/pnas.93.7.2895 |pmid=8610138 |bibcode=1996PNAS...93.2895R |pmc=39730 |doi-access=free}} via {{Harvnb|Kunda|1999|p=127}}</ref>

{| class="wikitable" style="width:250px;text-align:center;margin: 1em auto 1em auto"
|+ Example
|-
! Days !! Rain !! No rain
|-
! Arthritis
| 14 || 6
|-
! No arthritis
| 7 || 2
|}

This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior.<ref name="kunda127">{{Harvnb|Kunda|1999|pp=127–130}}</ref> In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of ''positive-positive'' cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain or good weather).<ref name="plous162">{{Harvnb|Plous|1993|pp=162–164}}</ref> This parallels the reliance on positive tests in hypothesis testing.<ref name="kunda127" /> It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.<ref name="kunda127" />
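
For the example table above, the near-zero association can be checked directly (an illustrative calculation based on the table's figures, not on data from the cited study). The [[phi coefficient]] of the 2 × 2 table is

<math display="block">\phi = \frac{14 \times 2 - 6 \times 7}{\sqrt{(14+6)(7+2)(14+7)(6+2)}} = \frac{-14}{\sqrt{30240}} \approx -0.08,</math>

so although the rain-and-arthritis cell is the largest, the two events are essentially uncorrelated; attending mainly to that cell is what produces the illusory association.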