[Image: Mock trials allow researchers to examine confirmation biases in a realistic setting.]
Confirmation Bias
Confirmation bias is the tendency to search for,
interpret, favor, and recall information in a way that
confirms or supports one's beliefs or values.
People display this bias when they select information
that supports their views, ignoring contrary
information, or when they interpret ambiguous evidence
as supporting their existing attitudes. The effect is
strongest for desired outcomes, for emotionally charged
issues, and for deeply entrenched beliefs. Confirmation
bias cannot be eliminated entirely, but it can be
managed, for example, by education and training in
critical thinking skills.
Definition and context
Confirmation bias, a phrase coined by English
psychologist Peter Wason, is the tendency of people to
favor information that confirms or strengthens their
beliefs or values; once affirmed, such beliefs are
difficult to dislodge. Confirmation bias is an example
of a cognitive bias.
Confirmation bias (or confirmatory bias) has also been
termed myside bias; the term "congeniality bias" has
also been used.
Confirmation biases are effects in information
processing. They differ from what is sometimes called
the behavioral confirmation effect, commonly known as
self-fulfilling prophecy, in which a person's
expectations influence their own behavior, bringing
about the expected result.
Some psychologists restrict the term "confirmation bias"
to selective collection of evidence that supports what
one already believes while ignoring or rejecting
evidence that supports a different conclusion. Others
apply the term more broadly to the tendency to preserve
one's existing beliefs when searching for evidence,
interpreting it, or recalling it from memory.
Confirmation bias is a result of automatic,
unintentional strategies rather than deliberate
deception. Confirmation bias cannot be avoided or
eliminated entirely, but only managed by improving
education and critical thinking skills.
Confirmation bias is a broad construct with several
possible explanations: hypothesis-testing by
falsification, hypothesis-testing by a positive test
strategy, and information-processing explanations.
Types of confirmation bias
Biased search for information
Experiments have found repeatedly that people tend to
test hypotheses in a one-sided way, by searching for
evidence consistent with their current hypothesis.
Rather than searching through all the relevant evidence,
they phrase questions to receive an affirmative answer
that supports their theory. They look for the
consequences that they would expect if their hypothesis
were true, rather than what would happen if it were
false. For example, someone using yes/no questions to
find a number they suspect to be the number 3 might ask,
"Is it an odd number?" People prefer this type of
question, called a "positive test", even when a negative
test such as "Is it an even number?" would yield exactly
the same information. However, this does not mean that
people seek tests that guarantee a positive answer. In
studies where subjects could select either such
pseudo-tests or genuinely diagnostic ones, they favored
the genuinely diagnostic.
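The equivalence of the two questions can be checked with a short
sketch, assuming a secret number between 1 and 10; the Python
helper information_gain below is purely illustrative and not part
of the studies described here. It measures how many bits of
information each yes/no question is expected to provide.

    import math

    def information_gain(candidates, predicate):
        """Expected entropy reduction (in bits) from asking one yes/no question."""
        n = len(candidates)
        yes = [c for c in candidates if predicate(c)]
        no = [c for c in candidates if not predicate(c)]

        def entropy(k):
            return math.log2(k) if k > 0 else 0.0

        return entropy(n) - (len(yes) / n) * entropy(len(yes)) - (len(no) / n) * entropy(len(no))

    numbers = list(range(1, 11))  # assumed candidate set: the secret number is 1-10
    positive = information_gain(numbers, lambda x: x % 2 == 1)  # "Is it an odd number?"
    negative = information_gain(numbers, lambda x: x % 2 == 0)  # "Is it an even number?"
    print(positive, negative)  # both questions are worth exactly 1.0 bit

Either phrasing splits the candidates into the same two halves, so
the preference for the "positive test" is a matter of framing, not
of information.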
Biased interpretation of information
Confirmation biases are not limited to the collection of
evidence. Even if two individuals have the same
information, the way they interpret it can be biased.
A team at Stanford University conducted an experiment
involving participants who felt strongly about capital
punishment, with half in favor and half against it. Each
participant read descriptions of two studies: a
comparison of U.S. states with and without the death
penalty, and a comparison of murder rates in a state
before and after the introduction of the death penalty.
After reading a quick description of each study, the
participants were asked whether their opinions had
changed. Then, they read a more detailed account of each
study's procedure and had to rate whether the research
was well-conducted and convincing. In fact, the studies
were fictional. Half the participants were told that one
kind of study supported the deterrent effect and the
other undermined it, while for other participants the
conclusions were swapped.
The participants, whether supporters or opponents,
reported shifting their attitudes slightly in the
direction of the first study they read. Once they read
the more detailed descriptions of the two studies, they
almost all returned to their original belief regardless
of the evidence provided, pointing to details that
supported their viewpoint and disregarding anything
contrary. Participants described studies supporting
their pre-existing view as superior to those that
contradicted it, in detailed and specific ways. Writing
about a study that seemed to undermine the deterrence
effect, a death penalty proponent wrote, "The research
didn't cover a long enough period of time," while an
opponent's comment on the same study said, "No strong
evidence to contradict the researchers has been
presented." The results illustrated that people set
higher standards of evidence for hypotheses that go
against their current expectations. This effect, known
as "disconfirmation bias", has been supported by other
experiments.
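The asymmetry can be illustrated with a toy calculation; this is a
hypothetical sketch, not the model used in the study, and the
update rule and the scrutiny parameter are assumptions invented for
the example. If evidence against a prior belief must clear a higher
bar, an evenly balanced pair of studies no longer cancels out.

    def update_belief(belief, direction, strength, scrutiny=0.3):
        """belief is in [-1, 1]; direction is +1 (supporting) or -1 (opposing).
        Opposing evidence is discounted by the stricter standard applied to it."""
        weight = strength if direction * belief >= 0 else strength * scrutiny
        return max(-1.0, min(1.0, belief + direction * weight))

    belief = 0.5                             # starts out favoring the deterrence hypothesis
    belief = update_belief(belief, +1, 0.2)  # congenial study accepted at face value
    belief = update_belief(belief, -1, 0.2)  # contrary study picked apart and discounted
    print(belief)                            # ~0.64: mixed evidence strengthened the belief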
Biased memory recall of information
People may remember evidence selectively to reinforce
their expectations, even if they gather and interpret
evidence in a neutral manner. This effect is called
"selective recall", "confirmatory memory", or
"access-biased memory". Psychological theories differ in
their predictions about selective recall. Schema theory
predicts that information matching prior expectations
will be more easily stored and recalled than information
that does not match. Some alternative approaches say
that surprising information stands out and so is
memorable. Predictions from both these theories have
been confirmed in different experimental contexts, with
no theory winning outright.
Individual differences
Myside bias was once believed to be correlated with
intelligence; however, studies have shown that it is
influenced more by the ability to think rationally than
by level of intelligence. Myside bias can cause an
inability to effectively and logically evaluate the
opposite side of an argument. Studies have suggested
that myside bias reflects an absence of "active
open-mindedness", meaning the active search for reasons
why an initial idea may be wrong. Typically, myside bias
is operationalized in empirical studies as the quantity
of evidence generated in support of one's own side
compared with the opposite side.
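As a rough sketch of that operationalization (a hypothetical
scoring scheme, not a published instrument; the function name and
the example counts are invented), the measure can be as simple as
comparing how many arguments a participant produces for each side:

    def myside_bias_score(own_side_arguments, other_side_arguments):
        """Positive values mean more evidence was generated for one's own side."""
        return len(own_side_arguments) - len(other_side_arguments)

    # Example: a participant lists five arguments for their own position
    # but only one for the opposing position.
    score = myside_bias_score(["a1", "a2", "a3", "a4", "a5"], ["b1"])
    print(score)  # 4: evidence generation is skewed toward the participant's own side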
Wikipedia: Confirmation Bias