The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgment of others while failing to see the impact of biases on one's own judgment. The term was coined by Emily Pronin, a social psychologist in Princeton University's Department of Psychology, with colleagues Daniel Lin and Lee Ross. The bias blind spot is named after the visual blind spot. Most people appear to exhibit it: in a sample of more than 600 residents of the United States, more than 85% believed they were less biased than the average American, and only one participant believed that he or she was more biased than average. People do vary in the extent to which they exhibit the bias blind spot; it appears to be a stable, measurable individual difference (for a scale, see Scopelliti et al. 2015).
The bias blind spot appears to be a true blind spot in that it is unrelated to actual decision-making ability: performance on indices of decision-making competence is not related to individual differences in bias blind spot. In other words, everyone seems to think they are less biased than other people, regardless of their actual decision-making ability.
Bias blind spots may be caused by a variety of other biases and self-deceptions.
Self-enhancement biases may play a role, in that people are motivated to view themselves in a positive light. Biases are generally seen as undesirable, so people tend to think of their own perceptions and judgments as rational, accurate, and free of bias. Self-enhancement also applies when people analyze their own decisions, leading them to regard themselves as better decision makers than others.
People also tend to believe they are aware of how and why they make their decisions, and therefore conclude that bias did not play a role. Yet many decisions are shaped by biases and cognitive shortcuts, which are unconscious processes; by definition, people are unaware of unconscious processes and therefore cannot see their influence on decision making.
Research has shown that even when people are made aware of various biases acting on their perceptions, decisions, or judgments, they are still unable to control them. This contributes to the bias blind spot in that even a person who is told they are biased cannot correct their biased perception.
Emily Pronin and Matthew Kugler have argued that this phenomenon is due to the introspection illusion. In their experiments, subjects had to make judgments about themselves and about other subjects. They displayed standard biases, for example rating themselves above the others on desirable qualities (demonstrating illusory superiority). The experimenters explained cognitive bias, and asked the subjects how it might have affected their judgment. The subjects rated themselves as less susceptible to bias than others in the experiment (confirming the bias blind spot). When they had to explain their judgments, they used different strategies for assessing their own and others' bias.
Pronin and Kugler's interpretation is that, when people decide whether someone else is biased, they use overt behaviour. On the other hand, when assessing whether they themselves are biased, people look inward, searching their own thoughts and feelings for biased motives. Since biases operate unconsciously, these introspections are not informative, but people wrongly treat them as a reliable indication that they themselves, unlike other people, are immune to bias.
Pronin and Kugler tried to give their subjects access to others' introspections. To do this, they made audio recordings of subjects who had been told to say whatever came into their heads as they decided whether their answer to a previous question might have been affected by bias. Although subjects persuaded themselves they were unlikely to be biased, their introspective reports did not sway the assessments of observers.
People tend to attribute bias unevenly. When people reach different conclusions, they tend to label one another as biased while labeling themselves as accurate and unbiased. Pronin hypothesizes that this misattribution of bias may be a source of conflict and misunderstanding between people. In labeling another person as biased, one may also interpret that person's intentions cynically; but when examining one's own cognitions, people judge themselves by their good intentions. In this case, one may attribute another's bias to "intentional malice" rather than to an unconscious process.
Pronin also proposes ways to use awareness of the bias blind spot to reduce conflict and to think in a more "scientifically informed" way. Although we are unable to control the biases acting on our own cognitions, we can keep in mind that biases act on everyone. Pronin suggests that people might use this knowledge to separate others' intentions from their actions.
Initial evidence suggests that the bias blind spot is not related to actual decision-making ability: participants' performance on various tasks measuring decision-making competence was unrelated to their susceptibility to the bias blind spot. The bias blind spot does, however, appear to increase susceptibility to related errors: people high in bias blind spot are more likely to ignore the advice of others and are less likely to benefit from training intended to reduce their commission of other biases.