=== Biased interpretation of information ===
{{Quote box |quote=Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.|source=—[[Michael Shermer]]<ref>{{Harvnb|Kida|2006|p=157}}</ref> |width=30em |align=right}}
Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.

A team at [[Stanford University]] conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it.<ref name="lord1979" /><ref name=baron201 /> Each participant read descriptions of two studies: a comparison of [[U.S. state]]s with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing.<ref name="lord1979" /> In fact, the studies were fictional. Half the participants were told that one kind of study supported the [[deterrence (psychology)|deterrent]] effect and the other undermined it, while for other participants the conclusions were swapped.<ref name="lord1979" /><ref name="baron201">{{Harvnb|Baron|2000|pp=201–202}}</ref>

The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary.
Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.<ref name="lord1979" /><ref name="vyse122">{{Harvnb|Vyse|1997|p=122}}</ref> Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."<ref name="lord1979" /> The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.<ref name="taber_political" />

Another study of biased interpretation occurred during the [[2004 United States presidential election|2004 U.S. presidential election]] and involved participants who reported having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate [[George W. Bush]], Democratic candidate [[John Kerry]] or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent.<ref name="westen2006">{{Citation|last1=Westen |first1=Drew |first2=Pavel S. |last2=Blagov |first3=Keith |last3=Harenski |first4=Clint |last4=Kilts |first5=Stephan |last5=Hamann |year=2006 |title=Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election |journal=[[Journal of Cognitive Neuroscience]] |volume=18 |issue=11 |pages=1947–1958 |doi=10.1162/jocn.2006.18.11.1947 |pmid=17069484|citeseerx=10.1.1.578.8097 |s2cid=8625992 }}</ref>{{rp|1948}} There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.<ref name="westen2006" />{{rp|1951}}

[[File:MRI-Philips.JPG|thumb|right|alt=A large round machine with a hole in the middle, with a platter for a person to lie on so that their head can fit into the hole|An [[Magnetic resonance imaging|MRI scanner]] allowed researchers to examine how the human brain deals with dissonant information.]]
In this experiment, the participants made their judgments while in a [[magnetic resonance imaging]] (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, [[emotion]]al centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the [[cognitive dissonance]] induced by reading about their favored candidate's irrational or [[Hypocrisy|hypocritical]] behavior.<ref name="westen2006" />{{rp|1956}}

Biases in belief interpretation are persistent, regardless of intelligence level. Participants in an experiment took the [[SAT]] test (a college admissions test used in the United States) to assess their intelligence levels. They then read information regarding safety concerns for vehicles, and the experimenters manipulated the national origin of the car. American participants gave their opinion on whether the car should be banned on a six-point scale, where one indicated "definitely yes" and six indicated "definitely no".
Participants first evaluated whether they would allow a dangerous German car on American streets and a dangerous American car on German streets. Participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. Intelligence level made no difference to how readily participants would ban a car.<ref name="stanovich">{{Citation|last=Stanovich|first=K.E.|author2=West, R.F. |author3=Toplak, M.E. |s2cid=14505370|title=Myside bias, rational thinking, and intelligence|journal=Current Directions in Psychological Science|year=2013|volume=22|issue=4|pages=259–264 |doi=10.1177/0963721413480174}}</ref>

Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.<ref>{{Citation |last1=Gadenne |first1=V. |first2=M. |last2=Oswald |year=1986 |title=Entstehung und Veränderung von Bestätigungstendenzen beim Testen von Hypothesen [Formation and alteration of confirmatory tendencies during the testing of hypotheses] |journal=Zeitschrift für Experimentelle und Angewandte Psychologie |volume=33 |pages=360–374}} via {{Harvnb|Oswald|Grosjean|2004|p=89}}</ref>