== Types ==

=== Biased search for information ===

[[File:Fred Barnard07.jpg|thumb|right|200px|alt=A drawing of a man sitting on a stool at a writing desk|Confirmation bias has been described as an internal "[[Wikt:yes man|yes man]]", echoing back a person's beliefs like [[Charles Dickens]]'s character [[Uriah Heep (character)|Uriah Heep]].<ref name="WSJ">{{Citation |title=How to ignore the yes-man in your head |first=Jason |last=Zweig |newspaper=[[Wall Street Journal]] |date=19 November 2009 |url=https://www.wsj.com/articles/SB10001424052748703811604574533680037778184 |access-date=13 June 2010 |archive-date=14 February 2015 |archive-url=https://web.archive.org/web/20150214052645/http://www.wsj.com/articles/SB10001424052748703811604574533680037778184 |url-status=live }}</ref>]]

Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current [[hypothesis]].<ref name="nickerson"/>{{rp|177–178}}<ref name="kunda112" /> Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory.<ref name="baron162" /> They look for the consequences that they would expect if their hypothesis was true, rather than what would happen if it was false.<ref name="baron162">{{Harvnb|Baron|2000 |pp=162–64}}</ref> For example, someone using yes/no questions to find a number they suspect to be the number 3 might ask, "Is it an [[odd number]]?" People prefer this type of question, called a "positive test", even when a negative test such as "Is it an even number?" would yield exactly the same information.<ref>{{Harvnb|Kida|2006|pp=162–65}}</ref> However, this does not mean that people seek tests that guarantee a positive answer. In studies where subjects could select either such pseudo-tests or genuinely diagnostic ones, they favored the genuinely diagnostic.<ref>{{Citation |last1=Devine |first1=Patricia G. |first2=Edward R. |last2=Hirt |first3=Elizabeth M. |last3=Gehrke |year=1990 |title=Diagnostic and confirmation strategies in trait hypothesis testing |journal=Journal of Personality and Social Psychology |volume=58 |issue=6 |pages=952–963 |issn=1939-1315 |doi=10.1037/0022-3514.58.6.952}}</ref><ref>{{Citation |last1=Trope |first1=Yaacov |first2=Miriam |last2=Bassok |year=1982 |title=Confirmatory and diagnosing strategies in social information gathering |journal=Journal of Personality and Social Psychology |volume=43 |issue=1 |pages=22–34 |issn=1939-1315 |doi=10.1037/0022-3514.43.1.22}}</ref>
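The equivalence of the two question framings can be shown with a short sketch (the ten-number search space and the code below are illustrative assumptions, not material from the cited studies): both questions split the candidates into the same two groups, so either answer rules out exactly the same possibilities.

<syntaxhighlight lang="python">
# Illustrative sketch: a hidden number is assumed to lie somewhere in 1..10.
candidates = list(range(1, 11))

def partition(options, question):
    """Split the options by the yes/no answer to a question."""
    yes = [n for n in options if question(n)]
    no = [n for n in options if not question(n)]
    return yes, no

# Positive test ("Is it an odd number?") versus negative test ("Is it an even number?").
odd_yes, odd_no = partition(candidates, lambda n: n % 2 == 1)
even_yes, even_no = partition(candidates, lambda n: n % 2 == 0)

# Both questions induce the same two groups (only the yes/no labels are swapped),
# so either answer eliminates exactly the same candidates.
assert {frozenset(odd_yes), frozenset(odd_no)} == {frozenset(even_yes), frozenset(even_no)}
</syntaxhighlight>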
The preference for positive tests in itself is not a bias, since positive tests can be highly informative.<ref name="klaymanha" /> However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true.<ref name="oswald82">{{Harvnb|Oswald|Grosjean|2004|pp=82–83}}</ref> In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior.<ref name="kunda112" /> Thus any search for evidence in favor of a hypothesis is likely to succeed.<ref name=oswald82 />

One illustration of this is the way the phrasing of a question can significantly change the answer.<ref name="kunda112">{{Harvnb|Kunda|1999|pp=112–115}}</ref> For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you ''un''happy with your social life?"<ref>{{Citation |last1=Kunda |first1=Ziva |first2=G.T. |last2=Fong |first3=R. |last3=Sanitoso |first4=E. |last4=Reber |year=1993 |title=Directional questions direct self-conceptions |journal=[[Journal of Experimental Social Psychology]] |volume=29 |pages=62–63 |issn=0022-1031|doi=10.1006/jesp.1993.1004 }} via {{Harvnb|Fine|2006|pp=63–65}}</ref>

Even a small change in a question's wording can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case.<ref name="shafir" /> Participants read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take them away for long periods of time. When asked, "Which parent should have custody of the child?" the majority of participants chose Parent B, looking mainly for positive attributes. However, when asked, "Which parent should be denied custody of the child?" they looked for negative attributes and the majority answered that Parent B should be denied custody, implying that Parent A should have custody.<ref name="shafir">{{Citation |last=Shafir |first=E. |year=1993 |title=Choosing versus rejecting: why some options are both better and worse than others |journal=Memory and Cognition |volume=21 |pages=546–556 |pmid=8350746 |issue=4 |doi=10.3758/bf03197186|doi-access=free }} via {{Harvnb|Fine|2006|pp=63–65}}</ref>

Similar studies have demonstrated how people engage in a biased search for information, but also that this phenomenon may be limited by a preference for genuine diagnostic tests. In an initial experiment, participants rated another person on the [[extroversion and introversion|introversion–extroversion]] personality dimension on the basis of an interview. They chose the interview questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, almost all the questions presumed extroversion, such as, "What would you do to liven up a dull party?" These [[loaded question]]s gave the interviewees little or no opportunity to falsify the hypothesis about them.<ref>{{Citation |last1=Snyder |first1=Mark |first2=William B. Jr. |last2=Swann |year=1978 |title=Hypothesis-testing processes in social interaction |journal=[[Journal of Personality and Social Psychology]] |volume=36 |issue=11 |pages=1202–1212 |doi=10.1037/0022-3514.36.11.1202}} via {{Harvnb|Poletiek|2001|p=131}}</ref> A later version of the experiment gave the participants less presumptive questions to choose from, such as, "Do you shy away from social interactions?"<ref name="kunda117" /> Participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests. This pattern, of a main preference for diagnostic tests and a weaker preference for positive tests, has been replicated in other studies.<ref name="kunda117">{{Harvnb|Kunda|1999|pp=117–18}}</ref>

Goedert, Ellefson, and Rehder (2014) examined the influence of prior beliefs about the strength of causal relations on how people collect and evaluate evidence. The findings suggest that people's sense of plausibility will influence their search for evidence in a way that bolsters their prior views.
In this experiment, participants read stories about a range of causes and effects, for example skin diseases and car accidents, and collected evidence bearing on how probative particular causes were. They found that, on average, participants were more likely to search for confirming evidence for causes they concluded were plausible, and for disconfirming evidence for causes they considered implausible, a strategy the researchers dubbed the positive test strategy. This result implies that plausibility does not just change how people interpret evidence, but also what evidence they seek. Furthermore, the research indicated that when participants perceived a cause as unlikely, they gave preference to disconfirming evidence, yet it may still be difficult for people to update their beliefs when faced with such disconfirming evidence.<ref>{{Citation |last1=Goedert |first1=K.M. |first2=M.R. |last2=Ellefson |first3=B. |last3=Rehder |year=2014 |title=Causal inference and evidence gathering: The influence of prior beliefs on causal reasoning |journal=Psychological Science |volume=25 |issue=4 |pages=783–791}}</ref>

Personality traits influence and interact with biased search processes.<ref name=albarracin>{{Citation|last=Albarracin|first=D.|author2=Mitchell, A.L.|title=The role of defensive confidence in preference for proattitudinal information: How believing that one is strong can sometimes be a defensive weakness|journal=Personality and Social Psychology Bulletin|year=2004|volume=30|issue=12|pages=1565–1584|doi=10.1177/0146167204271180|pmid=15536240|pmc=4803283}}</ref> Individuals vary in their abilities to defend their attitudes from external attacks in relation to [[selective exposure theory|selective exposure]]. Selective exposure occurs when individuals search for information that is consistent, rather than inconsistent, with their personal beliefs.<ref>{{Citation|last=Fischer|first=P.|author2=Fischer, Julia K. |author3=Aydin, Nilüfer |author4=Frey, Dieter |title=Physically attractive social information sources lead to increased selective exposure to information|journal=Basic and Applied Social Psychology|year=2010|volume=32|issue=4|pages=340–347|doi=10.1080/01973533.2010.519208|s2cid=143133082}}</ref> An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs.<ref name=albarracin /> People with high [[confidence]] levels more readily seek out information that contradicts their personal position in order to form an argument. This can take the form of ''oppositional news consumption'', where individuals seek opposing partisan news in order to counterargue.<ref>{{cite book |last1=Dahlgren |first1=Peter M. |title=Media Echo Chambers: Selective Exposure and Confirmation Bias in Media Use, and its Consequences for Political Polarization |date=2020 |publisher=University of Gothenburg |location=Gothenburg |isbn=978-91-88212-95-5 |url=https://gupea.ub.gu.se/handle/2077/67023?locale=en |mode=cs2 |access-date=16 October 2021 |archive-date=6 April 2023 |archive-url=https://web.archive.org/web/20230406025447/https://gupea.ub.gu.se/handle/2077/67023?locale=en |url-status=live }}</ref> Individuals with low confidence levels do not seek out contradictory information and prefer information that supports their personal position.
People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions.<ref name="stanovich"/> Heightened confidence levels decrease preference for information that supports individuals' personal beliefs.

Another experiment gave participants a complex rule-discovery task that involved moving objects simulated by a computer.<ref name="mynatt1978">{{Citation |last1=Mynatt |first1=Clifford R. |first2=Michael E. |last2=Doherty |first3=Ryan D. |last3=Tweney |year=1978 |title=Consequences of confirmation and disconfirmation in a simulated research environment |journal=Quarterly Journal of Experimental Psychology |volume=30 |issue=3 |pages=395–406 |url=https://www.academia.edu/442226 |doi=10.1080/00335557843000007 |s2cid=145419628 }}</ref> Objects on the computer screen followed specific laws, which the participants had to figure out. Participants could "fire" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the participants figured out the rules of the system. They typically attempted to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing objective evidence that refuted their working hypotheses, they frequently continued doing the same tests. Some of the participants were taught proper hypothesis-testing, but these instructions had almost no effect.<ref name="mynatt1978" />
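The confirm-rather-than-falsify pattern can also be sketched briefly (the hidden rule and the tester's hypothesis below are invented for illustration and are not a reconstruction of the experiment's simulated environment): a tester who only tries cases that their hypothesis predicts will succeed sees nothing but confirmations, even though the hypothesis is wrong.

<syntaxhighlight lang="python">
def hidden_rule(n):
    """The system's actual law (illustrative): it accepts any even number."""
    return n % 2 == 0

def hypothesis(n):
    """The tester's current belief (illustrative): only multiples of 4 are accepted."""
    return n % 4 == 0

# Positive-test strategy: only try cases the hypothesis predicts will be accepted.
positive_tests = [4, 8, 12, 16, 20]
print(all(hidden_rule(n) for n in positive_tests))  # True: every trial "confirms" the hypothesis

# A case the hypothesis predicts should be rejected is what actually falsifies it.
print(hypothesis(6), hidden_rule(6))  # False True: the hypothesis is too narrow
</syntaxhighlight>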
=== Biased interpretation of information ===

{{Quote box |quote=Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.|source=—[[Michael Shermer]]<ref>{{Harvnb|Kida|2006|p=157}}</ref> |width=30em |align=right}}

Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.

A team at [[Stanford University]] conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it.<ref name="lord1979" /><ref name=baron201 /> Each participant read descriptions of two studies: a comparison of [[U.S. state]]s with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing.<ref name="lord1979" /> In fact, the studies were fictional. Half the participants were told that one kind of study supported the [[deterrence (psychology)|deterrent]] effect and the other undermined it, while for other participants the conclusions were swapped.<ref name="lord1979" /><ref name="baron201">{{Harvnb|Baron|2000|pp=201–202}}</ref>

The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.<ref name="lord1979" /><ref name="vyse122">{{Harvnb|Vyse|1997|p=122}}</ref> Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."<ref name="lord1979" /> The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.<ref name="taber_political" />

Another study of biased interpretation occurred during the [[2004 United States presidential election|2004 U.S. presidential election]] and involved participants who reported having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate [[George W. Bush]], Democratic candidate [[John Kerry]] or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent.<ref name="westen2006">{{Citation|last1=Westen |first1=Drew |first2=Pavel S. |last2=Blagov |first3=Keith |last3=Harenski |first4=Clint |last4=Kilts |first5=Stephan |last5=Hamann |year=2006 |title=Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election |journal=[[Journal of Cognitive Neuroscience]] |volume=18 |issue=11 |pages=1947–1958 |doi=10.1162/jocn.2006.18.11.1947 |pmid=17069484|citeseerx=10.1.1.578.8097 |s2cid=8625992 }}</ref>{{rp|1948}} There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.<ref name="westen2006" />{{rp|1951}}

[[File:MRI-Philips.JPG|thumb|right|alt=A large round machine with a hole in the middle, with a platter for a person to lie on so that their head can fit into the hole|An [[Magnetic resonance imaging|MRI scanner]] allowed researchers to examine how the human brain deals with dissonant information.]]

In this experiment, the participants made their judgments while in a [[magnetic resonance imaging]] (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, [[emotion]]al centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the [[cognitive dissonance]] induced by reading about their favored candidate's irrational or [[Hypocrisy|hypocritical]] behavior.<ref name="westen2006" />{{rp|1956}}

Biases in belief interpretation are persistent, regardless of intelligence level. Participants in an experiment took the [[SAT]] test (a college admissions test used in the United States) to assess their intelligence levels. They then read information regarding safety concerns for vehicles, and the experimenters manipulated the national origin of the car.
American participants indicated whether the car should be banned on a six-point scale, where one indicated "definitely yes" and six indicated "definitely no". Participants first evaluated whether they would allow a dangerous German car on American streets and a dangerous American car on German streets. Participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. There was no difference across intelligence levels in the rate at which participants would ban a car.<ref name="stanovich">{{Citation|last=Stanovich|first=K.E.|author2=West, R.F. |author3=Toplak, M.E. |s2cid=14505370|title=Myside bias, rational thinking, and intelligence|journal=Current Directions in Psychological Science|year=2013|volume=22|issue=4|pages=259–264 |doi=10.1177/0963721413480174}}</ref>

Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.<ref>{{Citation |last1=Gadenne |first1=V. |first2=M. |last2=Oswald |year=1986 |title=Entstehung und Veränderung von Bestätigungstendenzen beim Testen von Hypothesen [Formation and alteration of confirmatory tendencies during the testing of hypotheses] |journal=Zeitschrift für Experimentelle und Angewandte Psychologie |volume=33 |pages=360–374}} via {{Harvnb|Oswald|Grosjean|2004|p=89}}</ref>

=== Biased recall of information ===

People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect is called "selective recall", "confirmatory memory", or "access-biased memory".<ref>{{Citation |last1=Hastie |first1=Reid |first2=Bernadette |last2=Park |chapter=The relationship between memory and judgment depends on whether the judgment task is memory-based or on-line |title=Social cognition: key readings |editor-first=David L. |editor-last=Hamilton |publisher=Psychology Press |location=New York |year=2005 |page=394 |isbn=978-0-86377-591-8 |oclc=55078722}}</ref> Psychological theories differ in their predictions about selective recall. [[Schema (psychology)|Schema theory]] predicts that information matching prior expectations will be more easily stored and recalled than information that does not match.<ref name=oswald88 /> Some alternative approaches say that surprising information stands out and so is memorable.<ref name="oswald88">{{Harvnb|Oswald|Grosjean|2004|pp=88–89}}</ref> Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.<ref>{{Citation |last1=Stangor |first1=Charles |first2=David |last2=McMillan |year=1992 |title=Memory for expectancy-congruent and expectancy-incongruent information: A review of the social and social developmental literatures |journal=Psychological Bulletin |volume=111 |issue=1 |pages=42–61 |doi=10.1037/0033-2909.111.1.42}}</ref>

In one study, participants read a profile of a woman which described a mix of introverted and extroverted behaviors.<ref name="snydercantor" /> They later had to recall examples of her introversion and extroversion.
One group was told this was to assess the woman for a job as a librarian, while a second group was told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" group recalling more extroverted behavior.<ref name="snydercantor">{{Citation |last1=Snyder |first1=M. |first2=N. |last2=Cantor |year=1979 |title=Testing hypotheses about other people: the use of historical knowledge |journal=Journal of Experimental Social Psychology |volume=15 |pages=330–342 |doi=10.1016/0022-1031(79)90042-8 |issue=4}} via {{Harvnb|Goldacre|2008|p=231}}</ref>

A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.<ref name=oswald88 /><ref>{{Harvnb|Kunda|1999|pp=225–232}}</ref> In one of these, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.<ref>{{Citation |last1=Sanitioso |first1=Rasyid |first2=Ziva |last2=Kunda |first3=G.T. |last3=Fong |year=1990 |title=Motivated recruitment of autobiographical memories |journal=Journal of Personality and Social Psychology |issn=0022-3514 |volume=59 |issue=2 |pages=229–241 |doi=10.1037/0022-3514.59.2.229 |pmid=2213492}}</ref>

Changes in emotional states can also influence memory recall.<ref name=levine>{{Citation|last=Levine|first=L.|author2=Prohaska, V. |author3=Burgess, S.L. |author4=Rice, J.A. |author5=Laulhere, T.M. |title=Remembering past emotions: The role of current appraisals|journal=Cognition and Emotion|year=2001|volume=15|issue=4|pages=393–417|doi=10.1080/02699930125955|s2cid=22743423}}</ref><ref name=safer>{{Citation|last=Safer|first=M.A.|author2=Bonanno, G.A. |author3=Field, N. |title=It was never that bad: Biased recall of grief and long-term adjustment to the death of a spouse|journal=Memory|year=2001|volume=9|issue=3|pages=195–203|doi=10.1080/09658210143000065|pmid=11469313|s2cid=24729233}}</ref> Participants rated how they felt when they had first learned that [[O. J. Simpson]] had been acquitted of murder charges.<ref name="levine"/> They described their emotional reactions and confidence regarding the verdict one week, two months, and one year after the trial. Results indicated that participants' assessments of Simpson's guilt changed over time. The more that participants' opinion of the verdict had changed, the less stable were the participants' memories regarding their initial emotional reactions. When participants recalled their initial emotional reactions two months and a year later, past appraisals closely resembled current appraisals of emotion. People demonstrate sizable myside bias when discussing their opinions on controversial topics.<ref name="stanovich"/> Memory recall and construction of experiences undergo revision in relation to corresponding emotional states.

Myside bias has been shown to influence the accuracy of memory recall.<ref name="safer"/> In an experiment, widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses.
Participants reported experiencing greater grief at six months than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly [[correlation|correlated]] with their current level of grief. Individuals appear to use their current emotional states to analyze how they must have felt when experiencing past events.<ref name=levine /> Emotional memories are reconstructed by current emotional states.

One study showed how selective memory can maintain belief in [[extrasensory perception]] (ESP).<ref name="russell_jones">{{Citation |last1=Russell |first1=Dan |first2=Warren H. |last2=Jones |year=1980 |title=When superstition fails: Reactions to disconfirmation of paranormal beliefs |journal=Personality and Social Psychology Bulletin |volume=6 |issue=1 |pages=83–88 |issn=1552-7433 |doi=10.1177/014616728061012|s2cid=145060971 }} via {{Harvnb|Vyse|1997|p=121}}</ref> Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.<ref name="russell_jones" />