Confirmation bias
== Information processing explanations ==
There are currently three main [[Information processing (psychology)|information processing]] explanations of confirmation bias, plus a recent addition.

=== Cognitive versus motivational ===
[[File:Felicidade A very happy boy.jpg|thumb|250x250px|Happy events are more likely to be remembered.]]
According to [[Robert MacCoun]], most biased evidence processing occurs through a combination of "cold" (cognitive) and "hot" (motivated) mechanisms.<ref>{{Harvnb|MacCoun|1998}}</ref>

Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called ''[[heuristics in judgment and decision making|heuristics]]'', that they use.<ref>{{Harvnb|Friedrich|1993|p=298}}</ref> For example, people may judge the reliability of evidence by using the ''[[availability heuristic]]'', that is, how readily a particular idea comes to mind.<ref>{{Harvnb|Kunda|1999|p=94}}</ref> It is also possible that people can only focus on one thought at a time, and so find it difficult to test alternative hypotheses in parallel.<ref name="nickerson"/>{{rp|198–199}} Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.<ref name="klaymanha" /><ref name="nickerson"/>{{rp|200}}

Motivational explanations involve an effect of [[desire (emotion)|desire]] on [[belief]].<ref name="nickerson"/>{{rp|197}}<ref>{{Harvnb|Baron|2000|p=206}}</ref> It is known that people prefer positive thoughts over negative ones in a number of ways: this is called the "[[Pollyanna principle]]".<ref>{{Citation |last=Matlin |first=Margaret W.
|title=Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory |editor-first=Rüdiger F. |editor-last=Pohl |publisher=[[Psychology Press]] |location=Hove, UK |year=2004 |pages=[https://archive.org/details/cognitiveillusio0000unse/page/255 255–272] |chapter=Pollyanna Principle |isbn=978-1-84169-351-4 |oclc=55124398 |chapter-url=https://archive.org/details/cognitiveillusio0000unse/page/255}}</ref> Applied to [[argument]]s or sources of [[evidence]], this could explain why desired conclusions are more likely to be believed true. According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. In other words, they ask, "Can I believe this?" for some suggestions and, "Must I believe this?" for others.<ref>{{Citation|last1=Dawson |first1=Erica |first2=Thomas |last2=Gilovich |author-link2=Thomas Gilovich |first3=Dennis T. |last3=Regan |date=October 2002 |title=Motivated reasoning and performance on the Wason Selection Task |journal=Personality and Social Psychology Bulletin |volume=28 |issue=10 |pages=1379–1387 |doi=10.1177/014616702236869|s2cid=143957893 }}</ref><ref>{{Citation |last1=Ditto |first1=Peter H. |first2=David F. |last2=Lopez |year=1992 |title=Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions |journal=[[Journal of Personality and Social Psychology]] |volume=63 |issue=4 |pages=568–584 |issn=0022-3514 |doi=10.1037/0022-3514.63.4.568}}</ref>

Although [[consistency]] is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information.
Social psychologist [[Ziva Kunda]] combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.<ref name="nickerson"/>{{rp|198}}

=== Cost-benefit ===
Explanations in terms of [[cost-benefit analysis]] assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.<ref>{{Harvnb|Oswald|Grosjean|2004|pp=91–93}}</ref> Using ideas from [[evolutionary psychology]], James Friedrich suggests that people do not primarily aim at [[truth]] in testing hypotheses, but try to avoid the most costly errors. For example, employers might ask one-sided questions in job interviews because they are focused on weeding out unsuitable candidates.<ref>{{Harvnb|Friedrich|1993|pp=299, 316–317}}</ref>

[[Yaacov Trope]] and Akiva Liberman's refinement of this theory assumes that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate or remember evidence of their honesty in a biased way.<ref>{{Citation |last1=Trope |first1=Y. |first2=A. |last2=Liberman |title=Social psychology: Handbook of basic principles |editor1-first=E. Tory |editor1-last=Higgins |editor2-first=Arie W.
|editor2-last=Kruglanski |publisher=Guilford Press |location=New York |year=1996 |chapter=Social hypothesis testing: Cognitive and motivational mechanisms |isbn=978-1-57230-100-9 |oclc=34731629}} via {{Harvnb|Oswald|Grosjean|2004|pp=91–93}}</ref>

When someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more [[empathic]].<ref name=dardenne/> This suggests that when talking to someone who seems to be an introvert, it is a sign of better [[social skills]] to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly [[self-monitoring]] students, who are more sensitive to their environment and to [[social norms]], asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.<ref name="dardenne">{{Citation |last1=Dardenne |first1=Benoit |first2=Jacques-Philippe |last2=Leyens |title=Confirmation bias as a social skill |journal=[[Personality and Social Psychology Bulletin]] |year=1995 |volume=21 |issue=11 |pages=1229–1239 |doi=10.1177/01461672952111011 |s2cid=146709087 |issn=1552-7433 |url=https://orbi.uliege.be/bitstream/2268/28639/1/dardenne%26leyens_pspb_95.pdf |access-date=25 September 2019 |archive-date=9 September 2020 |archive-url=https://web.archive.org/web/20200909121223/https://orbi.uliege.be/bitstream/2268/28639/1/dardenne%26leyens_pspb_95.pdf |url-status=live }}</ref>

=== Exploratory versus confirmatory ===
Psychologists [[Jennifer Lerner]] and [[Philip Tetlock]] distinguish two different kinds of thinking process. ''[[Exploratory thought]]'' neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while ''confirmatory thought'' seeks to justify a specific point of view.
Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.<ref>{{Citation|editor=Sandra L. Schneider|title=Emerging perspectives on judgment and decision research|year=2003|publisher=[[Cambridge University Press]]|location=Cambridge [u. a.]|isbn=978-0-521-52718-7|page=445|author=Shanteau, James}}</ref><ref>{{Citation|last=Haidt|first=Jonathan|title=The righteous mind: Why good people are divided by politics and religion|year=2013|publisher=Penguin Books|location=London|isbn=978-0-14-103916-9|pages=87–88}}</ref><ref>{{Citation|editor1-first=Susan T.|editor1-last=Fiske|editor2-first=Daniel T.|editor2-last=Gilbert|editor3-first=Gardner|editor3-last=Lindzey|title=The handbook of social psychology|year=2010|publisher=[[Wiley (publisher)|Wiley]]|location=Hoboken, NJ|isbn=978-0-470-13749-9|page=[https://archive.org/details/handbookofsocial5ed2unse/page/811 811]|edition=5th|url=https://archive.org/details/handbookofsocial5ed2unse/page/811}}</ref>

=== Make-believe ===
Developmental psychologist Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe, which becomes "the basis for more complex forms of self-deception and illusion into adulthood."
The friction brought on by questioning as an adolescent with developing critical thinking can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.<ref name="apa">{{cite journal |author1=American Psychological Association |title=Why we're susceptible to fake news – and how to defend against it |journal=[[Skeptical Inquirer]] |date=2018 |volume=42 |issue=6 |pages=8–9 |mode=cs2}}</ref>

=== Optimal information acquisition ===
Recent research in economics has challenged the traditional view of confirmation bias as purely a cognitive flaw.<ref name="oi">{{cite web |last=Page |first=Lionel |date=2023-06-14 |title=Reassessing the Confirmation Bias: Is it a flaw or an efficient strategy? |url=https://optimallyirrational.substack.com/p/reassessing-the-confirmation-bias |website=Optimally Irrational |access-date=2024-10-13 }}</ref> Under conditions where acquiring and processing information is costly, seeking confirmatory evidence can actually be an optimal strategy. Instead of pursuing contrarian or disconfirming evidence, it may be more efficient to focus on sources likely to align with one's existing beliefs, given the constraints on time and resources.

Economist Weijie Zhong has developed a model demonstrating that individuals who must make decisions under time pressure, and who face costs for obtaining more information, will often prefer confirmatory signals. According to this model, when individuals believe strongly in a certain hypothesis, they optimally seek information that confirms it, allowing them to build confidence more efficiently. If the expected confirmatory signals are not received, their confidence in the initial hypothesis will gradually decline, leading to belief updating.
This approach shows that seeking confirmation is not necessarily biased but may be a rational allocation of limited attention and resources.<ref name="zhong">{{cite journal |last=Zhong |first=Weijie |year=2022 |title=Optimal dynamic information acquisition |journal=Econometrica |volume=90 |issue=4 |pages=1537–1582 |doi=10.3982/ECTA17668 |doi-broken-date=5 February 2025 }}</ref>
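The belief dynamics described above amount to repeated Bayesian updating: a strong prior, a search for a signal expected to confirm it, and a posterior that falls each time the expected signal fails to arrive. The following is a minimal illustrative sketch of that logic in Python; it is not Zhong's actual model, and all the probabilities are invented for the example.

```python
# Illustrative sketch of confirmatory information seeking via Bayesian
# updating. Not Zhong's model; all probabilities are made-up assumptions.

def update(prior, p_signal_if_true, p_signal_if_false, observed):
    """Posterior belief in a binary hypothesis after the signal is seen or not."""
    if observed:
        num = p_signal_if_true * prior
        den = num + p_signal_if_false * (1 - prior)
    else:
        num = (1 - p_signal_if_true) * prior
        den = num + (1 - p_signal_if_false) * (1 - prior)
    return num / den

belief = 0.8  # strong initial confidence in the hypothesis
# The agent consults a source chosen because it is likely to confirm:
# the signal appears with probability 0.9 if the hypothesis is true,
# and 0.4 even if it is false.
for _ in range(3):
    belief = update(belief, 0.9, 0.4, observed=False)  # signal never arrives
print(round(belief, 3))  # → 0.018: repeated misses erode the initial confidence
```

The sketch shows the asymmetry such models exploit: when the confirmatory signal does arrive, it raises confidence only modestly (here 0.8 → 0.9), but its repeated absence is highly informative, so a confirmation-seeking strategy can still correct mistaken beliefs over time.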