
Cognitive bias

Systematic pattern of deviation from norm or rationality in judgment

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

Although it may seem that such misperceptions are aberrations, biases can help people find commonalities and shortcuts that assist in navigating common situations in life.

Some cognitive biases are believed to be adaptive. Cognitive biases may lead to more effective actions in a given context. Furthermore, allowing cognitive biases enables faster decisions, which can be desirable when timeliness is worth more than accuracy, as illustrated in heuristics. Other cognitive biases are a "by-product" of human processing limitations, resulting from a lack of appropriate mental mechanisms (bounded rationality), the effects of the individual's constitution and biological state (see embodied cognition), or simply from a limited capacity for information processing.

Over the past six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics, a continually evolving list of cognitive biases has been identified. Daniel Kahneman and Amos Tversky (1996) argue that cognitive biases have efficient practical implications for areas including clinical judgment, entrepreneurship, finance, and management.


History

The concept of cognitive bias was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people's innumeracy, or inability to reason intuitively with larger orders of magnitude. Tversky, Kahneman, and colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. Tversky and Kahneman explained human differences in judgment and decision-making in terms of heuristics. Heuristics involve mental shortcuts that provide swift estimates of the probability of uncertain events. Heuristics are simple for the brain to compute but sometimes introduce "severe and systematic errors".

For example, the representativeness heuristic is defined as "the tendency to judge the frequency or likelihood" of an event by the extent to which the event "resembles the typical case".

The "Linda problem" illustrates the representativeness heuristic (Tversky & Kahneman, 1983). Participants were given a description of "Linda" that suggested Linda might well be a feminist (e.g., she was said to be concerned about discrimination and social justice issues). They were then asked whether they thought Linda was more likely to be (a) a "bank teller" or (b) a "bank teller active in the feminist movement". A majority chose answer (b). This error (mathematically, answer (b) cannot be more probable than answer (a)) is an example of the "conjunction fallacy"; Tversky and Kahneman argued that respondents chose (b) because it seemed more "representative" or typical of persons who might fit the description of Linda. The representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgments of others (Haselton et al., 2005, p. 726).
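The arithmetic behind the conjunction fallacy can be made concrete: a conjunction is never more probable than either of its conjuncts. A minimal sketch, with probability values invented purely for illustration:

```python
# Conjunction rule behind the "Linda problem": P(A and B) <= P(A).
# The numbers below are invented for illustration; only the rule matters.
p_bank_teller = 0.05            # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # assumed P(feminist | bank teller)

# The conjunction "bank teller AND feminist" can never exceed P(bank teller):
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

assert p_teller_and_feminist <= p_bank_teller
print(f"P(teller) = {p_bank_teller:.2f}, "
      f"P(teller and feminist) = {p_teller_and_feminist:.2f}")
```

However plausible option (b) sounds, any probability assigned to it above P(bank teller) violates this rule, which is exactly the error a majority of Tversky and Kahneman's respondents made.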

Alternatively, critics of Kahneman and Tversky, such as Gerd Gigerenzer, have argued that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases. Rather, rationality should be conceived as an adaptive tool, not identical to the rules of formal logic or the probability calculus. Nevertheless, experiments such as the "Linda problem" grew into the heuristics and biases research program, which spread beyond academic psychology into other disciplines such as medicine and political science.


Types

Biases can be distinguished along a number of dimensions. For a more complete list, see the list of cognitive biases. Examples of cognitive biases include:

  • Biases specific to groups (such as the risky shift) versus biases at the individual level.
  • Biases that affect decision-making, where the desirability of options has to be considered (e.g., the sunk cost fallacy).
  • Biases, such as illusory correlation, that affect judgment of how likely something is, or of whether one thing is the cause of another.
  • Biases that affect memory, such as consistency bias (remembering one's past attitudes and behavior as more similar to one's present attitudes).
  • Biases that reflect a subject's motivation, for example the desire for a positive self-image leading to egocentric bias and the avoidance of unpleasant cognitive dissonance.

Other biases are due to the particular way the brain perceives, forms memories, and makes judgments. This distinction is sometimes described as "hot cognition" versus "cold cognition", as motivated reasoning can involve a state of arousal. Among the "cold" biases,

  • some are due to ignoring relevant information (e.g., neglect of probability),
  • some involve a decision or judgment being affected by irrelevant information (for example, the framing effect, where the same problem receives different responses depending on how it is described, or the distinction bias, where choices presented together have different outcomes than those presented separately), and
  • others give excessive weight to an unimportant but salient feature of the problem (e.g., anchoring).

The fact that some biases reflect motivation, specifically the motivation to have positive attitudes about oneself, accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias). There are also biases in how subjects evaluate in-groups versus out-groups: in-groups are evaluated as more diverse and "better" in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).

Some cognitive biases belong to the subgroup of attentional biases, which refers to paying increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure these biases are the Stroop task and the dot probe task.

People's susceptibility to some types of cognitive biases can be measured by the Cognitive Reflection Test (CRT), developed by Shane Frederick (2005).
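The best-known CRT item is the bat-and-ball problem: a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball. The intuitive answer ($0.10) is wrong; solving the two constraints gives $0.05. A short check of the arithmetic:

```python
# CRT bat-and-ball item (Frederick, 2005):
#   bat + ball = 1.10   and   bat = ball + 1.00
# Substituting the second equation into the first:
#   (ball + 1.00) + ball = 1.10  ->  2 * ball = 0.10  ->  ball = 0.05
from fractions import Fraction  # exact arithmetic avoids float rounding

total = Fraction(110, 100)       # $1.10
difference = Fraction(100, 100)  # $1.00

ball = (total - difference) / 2
bat = ball + difference

assert bat + ball == total and bat - ball == difference
print(f"ball = ${float(ball):.2f}, bat = ${float(bat):.2f}")
```

The reflexive answer of $0.10 satisfies the total but not the difference (a $1.00 bat is only $0.90 more than a $0.10 ball), which is exactly the kind of unchecked intuition the CRT is designed to detect.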

List of biases

The following is a list of the most commonly studied cognitive biases:

  • Fundamental attribution error (FAE) - Also known as correspondence bias, the tendency for people to overemphasize personality-based explanations for behaviors observed in others while underestimating the role and power of situational influences on the same behavior. The classic study by Edward E. Jones and Victor A. Harris (1967) illustrates the FAE: even though participants were made aware that the target's speech direction (pro-Castro/anti-Castro) had been assigned to the writer, they ignored the situational pressures and attributed pro-Castro attitudes to the writer when the speech represented such attitudes.
  • Unconscious bias - The implicit attribution of positive or negative qualities to a group of people.
  • Priming bias - The tendency to be influenced by what someone else has said so as to create a preconceived idea.
  • Confirmation bias - The tendency to search for or interpret information in a way that confirms one's preconceptions. In addition, individuals may discredit information that does not support their views. Confirmation bias is related to the concept of cognitive dissonance, in that individuals may reduce inconsistency by searching for information that reconfirms their views (Jermias, 2001, p. 146).
  • Affinity bias - The unconscious tendency to be favorably disposed toward people who are like ourselves.
  • Self-serving bias - The unconscious tendency to claim more responsibility for successes than for failures. It may also manifest as a tendency to evaluate ambiguous information in a way that benefits one's own interests.
  • Belief bias - When the evaluation of the logical strength of an argument is biased by belief in the truth or falsity of the conclusion.
  • Framing - Using an overly narrow approach to, and description of, a situation or problem.
  • Hindsight bias - Sometimes called the "I-knew-it-all-along" effect, the tendency to see past events as having been predictable.
  • Embodied cognition - A tendency toward selectivity in perception, attention, decision-making, and motivation based on the biological state of the body.
  • Anchoring bias - The inability of people to make appropriate adjustments from a starting point in order to arrive at a final answer; it can lead people to make suboptimal decisions. Anchoring affects decision-making in negotiations, medical diagnoses, and even judicial sentencing.
  • Status quo bias - An implication of loss aversion: in status quo bias, a decision-maker has an increased propensity to choose an option because it is the default or status quo. It has been shown to affect several important economic decisions, for example the choice of car insurance or electricity provider.
  • Overconfidence - The situation in which people tend to place too much trust in their ability to make correct decisions; they tend to overrate their abilities and skills as decision makers.

Practical significance

Many social institutions rely on individuals to make rational judgments.

Securities regulation, for instance, largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.

For example, a fair jury trial requires that jurors ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly, and resist fallacies such as appeals to emotion. The various biases demonstrated in these psychological experiments suggest that people frequently fail to do all of these things. However, they fail in systematic, directional ways that are predictable.

Cognitive biases are also related to the persistence of theory-of-everything thinking and to major social issues such as prejudice, and they hinder public acceptance of scientific, non-intuitive knowledge.

However, in some academic disciplines the study of bias is very popular. In entrepreneurship, for instance, bias is a widespread and well-studied phenomenon, because most of the decisions that concern the minds and hearts of entrepreneurs are computationally intractable.

Cognitive biases can create other problems that arise in everyday life. One study showed the link between cognitive bias, specifically approach bias, and inhibitory control over how many unhealthy snacks a person would eat. It found that participants who ate more of the unhealthy snacks tended to have less inhibitory control and to rely more on approach bias. Others have also hypothesized that cognitive biases could be linked to various eating disorders and to how people see their bodies and body image.

It has also been argued that cognitive biases can be exploited in destructive ways. Some believe that authority figures use cognitive biases and heuristics to manipulate others into serving their end goals. Some medications and other health-care treatments rely on cognitive biases to persuade people who are susceptible to them to use their products. Many see this as taking advantage of humans' natural struggle with judgment and decision-making, and believe it is the government's responsibility to regulate such misleading advertisements.

Cognitive biases also seem to play a role in the sale price and value of real estate. Participants in one experiment were shown a residential property, and afterwards a second property that was completely unrelated to the first. They were asked what they believed the value and sale price of the second property would be. Being shown an unrelated property was found to affect how participants valued the second property.

Reduction

Because they cause systematic errors, cognitive biases cannot be compensated for by a wisdom-of-the-crowd technique of averaging the answers of several people. Debiasing is the reduction of biases in judgment and decision-making through incentives, nudges, and training. Cognitive bias mitigation and cognitive bias modification are forms of debiasing specifically applicable to cognitive biases and their effects. Reference class forecasting is a method for systematically debiasing estimates and decisions, based on what Daniel Kahneman has dubbed the outside view.
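Why averaging fails here can be sketched in a few lines: averaging cancels each judge's independent noise, but a systematic bias shared by all judges survives intact. The bias and noise magnitudes below are arbitrary, chosen only for illustration:

```python
import random

random.seed(0)

true_value = 100.0   # the quantity being estimated (arbitrary)
shared_bias = 15.0   # systematic error common to every judge (arbitrary)
noise_sd = 10.0      # independent per-judge noise (arbitrary)

# Each judgment = truth + shared systematic bias + independent noise.
judgments = [true_value + shared_bias + random.gauss(0.0, noise_sd)
             for _ in range(10_000)]
crowd_average = sum(judgments) / len(judgments)

# Averaging has cancelled the noise but not the shared bias:
print(f"crowd average = {crowd_average:.1f} (truth is {true_value})")
```

The crowd average lands near 115, not 100: averaging removes variance, not systematic error, which is why debiasing has to target the bias itself.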

Similar to Gigerenzer (1996), Haselton et al. (2005) state that the content and direction of cognitive biases are not "arbitrary" (p. 730). Moreover, cognitive biases can be controlled. One debiasing technique aims to reduce biases by encouraging individuals to use controlled processing rather than automatic processing. With regard to reducing the FAE, monetary incentives and informing participants that they will be held accountable for their attributions have been linked to increased attributional accuracy. Training has also been shown to reduce cognitive bias. Carey K. Morewedge and colleagues (2015) found that research participants exposed to one-shot training interventions, such as educational videos and debiasing games teaching mitigating strategies, showed significant reductions in their commission of six cognitive biases immediately and up to three months later.

Cognitive bias modification refers to the process of modifying cognitive biases in healthy people, and also refers to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression, and addiction called cognitive bias modification therapy (CBMT). CBMT is a subset of a growing field of psychological therapies based on modifying cognitive processes, with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). Although cognitive bias modification can refer to modifying cognitive processes in healthy individuals, CBMT is a growing area of evidence-based psychological therapy in which cognitive processes are modified to relieve suffering from serious depression, anxiety, and addiction. CBMT techniques are technology-assisted therapies delivered via computer, with or without clinician support. CBM combines evidence and theory from the cognitive model of anxiety, cognitive neuroscience, and attentional models.

Cognitive bias modification has also been used to help those suffering from obsessive-compulsive beliefs and obsessive-compulsive disorder. This therapy has been shown to decrease obsessive-compulsive beliefs and behaviors.

Common theoretical causes of some cognitive biases

Biases arise from various processes that are sometimes difficult to distinguish. These include:

  • Bounded rationality - limits on optimization and rationality
  • Attribute substitution - making a complex, difficult judgment by subconsciously replacing it with a simpler judgment
  • Attribution theory
  • Cognitive dissonance and related phenomena
  • Links to information processing (heuristics), including:
    • Availability heuristic - estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples
    • Representativeness heuristic - assessment of probabilities based on similarity
    • Affect heuristic - a decision is based on an emotional response rather than a calculation of risks and benefits
  • Emotional and moral motivations, arising for example from:
    • The introspection illusion
  • Misinterpretation or misuse of statistics; innumeracy
  • Social influence
  • The brain's limited information processing capacity
  • Noisy information processing (distortions in the process of storage in and retrieval from memory). For example, a 2012 article in Psychological Bulletin proposed that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism. The article shows that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce regressive conservatism, belief revision (Bayesian conservatism), illusory correlations, illusory superiority (better-than-average effect) and inferiority (worse-than-average effect), the subadditivity effect, exaggerated expectation, overconfidence, and the hard-easy effect.
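A toy version of this noise-based account can be simulated. The functional form below (evidence held in log-odds, zero-mean Gaussian memory noise, readout through a logistic function) is an assumption chosen for illustration, not the article's actual model:

```python
import math
import random

random.seed(1)

def logit(p: float) -> float:
    return math.log(p / (1.0 - p))

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def mean_subjective_estimate(p_true: float, noise_sd: float = 1.0,
                             n: int = 50_000) -> float:
    """Average estimate after zero-mean Gaussian noise corrupts the
    log-odds representation of the objective probability."""
    return sum(sigmoid(logit(p_true) + random.gauss(0.0, noise_sd))
               for _ in range(n)) / n

for p in (0.70, 0.90, 0.99):
    est = mean_subjective_estimate(p)
    # Unbiased noise in the representation still yields a biased readout:
    # extreme probabilities are pulled back toward 0.5 (regressive
    # conservatism), with no motivational story required.
    print(f"objective {p:.2f} -> mean subjective estimate {est:.2f}")
```

The point of the sketch is that a single, mechanistically "dumb" noise source produces a systematic, directional distortion, mirroring the article's claim that several apparently unrelated biases can share one generative mechanism.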

Individual differences in cognitive biases

People do appear to have stable individual differences in their susceptibility to decision biases such as overconfidence, temporal discounting, and the bias blind spot. That said, these stable levels of bias within individuals can be changed. Participants in experiments who watched training videos and played debiasing games showed medium to large reductions immediately and up to three months later in the extent to which they exhibited six cognitive biases: anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness.

Individual differences in cognitive bias have also been linked to varying levels of cognitive ability and function. The Cognitive Reflection Test (CRT) has been used to help understand the connection between cognitive biases and cognitive ability. Results using the test to understand ability have been inconclusive; however, there does appear to be a correlation: those who score higher on the CRT have higher cognitive ability and rational-thinking skills, which in turn helps predict performance on cognitive bias and heuristics tests. Individuals with higher CRT scores tend to answer more correctly on various heuristics and cognitive bias tests and tasks.

Age is another individual difference that affects one's susceptibility to cognitive bias. Older individuals tend to be more susceptible to cognitive biases and to have less cognitive flexibility. However, in ongoing studies older individuals were able to decrease their susceptibility to cognitive biases. These experiments had both younger and older adults complete a framing task. Younger adults had more cognitive flexibility than older adults, and cognitive flexibility is linked to overcoming biases.


Criticism

Criticisms of theories of cognitive bias are usually founded on the fact that both sides of a debate often claim the other's thoughts to be subject to human nature and the result of cognitive bias, while claiming their own viewpoint to be above cognitive bias and the correct way to "overcome" the issue. This rift ties to a more fundamental problem stemming from a lack of consensus in the field, thereby creating arguments that can be used non-falsifiably to validate any conflicting viewpoint.

Gerd Gigerenzer is one of the main critics of the cognitive biases and heuristics program. Gigerenzer believes that cognitive biases are not biases at all, but rules of thumb, or as he would put it, "gut feelings", that can actually help us make accurate decisions in our lives. His view casts cognitive biases in a much more positive light than many other researchers do: whereas many view cognitive biases and heuristics as irrational ways of making decisions and judgments, Gigerenzer argues that using heuristics and cognitive biases is rational and helpful for making decisions in everyday life.

See also

  • Baconian method § Idols of the mind (idola mentis) - investigative process
  • Cognitive Bias Mitigation - Reduce the negative effects of cognitive biases
  • Cognitive bias modification
  • Cognitive Dissonance - Mental stress that results from multiple conflicting beliefs, ideas, or values held at the same time
  • Cognitive distortion - Exaggerated or irrational thought patterns
  • Cognitive Inertia - The tendency for a particular orientation in how an individual thinks about an issue, belief, or strategy to endure or resist change
  • Cognitive Psychology - Subdiscipline of Psychology
  • Cognitive traps for intelligence analysis
  • Cognitive vulnerability
  • Critical Thinking - The analysis of facts to form a judgment
  • Cultural cognition
  • Emotional bias
  • Evolutionary Psychology - Applying the theory of evolution to identify which human psychological traits are adaptations
  • Anticipatory bias
  • Fallacy - argument using faulty thinking
  • False Consensus Effect - Attributional type of cognitive bias
  • Implicit stereotype
  • Jumping to conclusions
  • List of cognitive biases - systematic patterns of deviation from the norm or rationality in assessment
  • Magical Thinking - The belief that unrelated events are causally related, even though there is no plausible causal link between them
  • Prejudice - Affective feelings about a person or thing based on perceived group membership
  • Presumption of Guilt - Presumption that a person is guilty of a crime
  • Rationality - The quality of agreeing with reason
  • Systemic Bias - Inherent tendency of a process to support certain outcomes
  • Theory-ladenness


Further reading

  • Eiser JR, van der Pligt J (1988). Attitudes and Decisions. London: Routledge.
  • Fine C (2006). A Mind of Its Own: How Your Brain Distorts and Deceives. Cambridge, UK: Icon Books.
  • Gilovich T (1993). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press.
  • Haselton MG, Nettle D, Andrews PW (2005). "The evolution of cognitive bias" (PDF). In Buss DM (ed.). Handbook of Evolutionary Psychology. Hoboken: Wiley. pp. 724-746.
  • Heuer RJ Jr (1999). Psychology of Intelligence Analysis. Central Intelligence Agency.
  • Kahneman D (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
  • Kida T (2006). Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. New York: Prometheus.
  • Krueger JI, Funder DC (June 2004). "Towards a balanced social psychology: causes, consequences, and cures for the problem-seeking approach to social behavior and cognition". The Behavioral and Brain Sciences. 27 (3): 313-27, discussion 328-76. doi:10.1017/s0140525x04000081. PMID 15736870.
  • Nisbett R, Ross L (1980). Human Inference: Strategies and Shortcomings of Social Judgment. Englewood Cliffs, NJ: Prentice Hall.
  • Piattelli-Palmarini M (1994). Inevitable Illusions: How Mistakes of Reason Rule Our Minds. New York: John Wiley & Sons.
  • Stanovich K (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven, CT: Yale University Press. Lay summary (PDF) (21 November 2010).
  • Tavris C, Aronson E (2007). Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. Orlando, FL: Harcourt Books.
  • Young S (2007). Micromessaging: Why Great Leadership Is Beyond Words. New York: McGraw-Hill.
